[HORIZON EUROPE] ERC's president explains recent changes in the evaluation of proposals

ERC President Maria Leptin has published a report entitled "The evaluation of research proposals: the why and what of the ERC's recent changes".

The ERC no longer specifies that it seeks proposals for “high-risk, high-gain” projects. In the report, Leptin explains that this phrase “was seen as potentially confusing and problematic”.

She stressed that the ERC continues to look for proposals that address important challenges, and hopes that the research it funds will lead to major advances at the frontier of knowledge. However, the terms ‘ambitious’ and ‘creative and original’ are better descriptors for the kinds of proposals the ERC should fund.

In addition, the ERC now puts more emphasis on the project than on its proposer, and no longer numerically grades both the planned work and the applicant.

In this article we provide a short summary of the 11-page report.

Introduction

Recently, the Scientific Council of the ERC has introduced changes in the evaluation processes and evaluation forms for the 2024 calls for research proposals, as described in the ERC ‘Work Programme 2024’ and the associated guidance documents. This report describes the changes, the discussions that led to them, and the reasoning behind them.

In the case of the ERC, project proposals are judged on excellence in creativity, originality and potential for significant advances in knowledge, or, to use the wording of the ERC work programme, on “the ground-breaking nature, ambition and feasibility of the proposal”.

Evaluation of the proposed project

During the discussions of the ERC Scientific Council, it was recognized that some terms previously in use may not be fit for purpose. The term ‘high-risk, high-gain’ was seen as potentially confusing and problematic. The concept is often invoked to discourage evaluation panels from conservatism in their choice of what to fund. The possibility that a project will not fulfil its aims is indeed inherent in frontier research, but precisely because the results cannot be predicted, the level of risk is hard to judge in advance. Conversely, a researcher who, for example, has already established with preliminary data that an exciting new approach is likely to work may be able to carry out ground-breaking work with a relatively high chance of success.

Evaluation of the applicant

The emphasis of the assessment of the PI should continue to be on whether they have demonstrated the ability to carry out ambitious and challenging research and have thereby contributed to advancing knowledge in their field. The only way to assess this in the first instance is through their track record of research outputs and, indirectly, through the recognition they receive from their peers.

Supervision of graduate students or postdoctoral researchers

Listing the numbers of PhD students and postdocs is not sufficient to assess whether a PI has been a good advisor to the members of their research team.

The Scientific Council was unable to come up with any other reliable and fair measure for ‘good mentorship’ and concluded that this information should no longer be asked for. 

Extramural funding

A wide range of factors can influence the amount of extramural funding a PI attracts, including the availability of grants in different national settings and for different types of research. Some PIs have generous institutional funding and may never have needed or wanted to apply for grants. Attracting extramural funding is therefore another proxy that does not necessarily measure the importance of a researcher’s work.

The ERC therefore considers this point only to ensure that the proposed project is not already funded from other sources, and only at the second step of evaluation.

Sections of the new template

RESEARCH ACHIEVEMENTS AND PEER RECOGNITION

This is what the Scientific Council considers to be the most important part of the track record: here the applicants provide the evidence for their ability to carry out demanding and original research. Evidence of peer recognition can help evaluators complement the view of the applicant they have formed based on the research outputs. 

Research achievements

The number of outputs is limited to ten (with an emphasis on more recent ones), and the format of those outputs is no longer specified. It is thus possible to list, for example, widely used datasets, open-source code or software, expeditions that yielded important data, granted patents, prototypes, or any other type of major research output. Because the significance of such outputs may not be self-evident, the new template encourages applicants to explain them in brief narratives.

Peer recognition

The new templates no longer ask for any specific elements of peer recognition; it is left to the applicant to decide what to list and to use the narrative component to explain the context and significance of the listed items.

Narrative elements

The Scientific Council felt that voluntary narrative elements can provide a more comprehensive view of a researcher's career, contributions, and potential. This is particularly the case when they are used to complement other assessment tools and metrics. They can highlight important aspects of a researcher's work that may not be captured by traditional metrics. The responsibility for selecting and explaining the research outputs and elements of peer recognition is thus left entirely to the applicant.

OTHER CONTRIBUTIONS

Engagement in peer review, teaching, academic leadership and other contributions

Most researchers are engaged in academic activities that do not directly contribute to their research. 

Particularly noteworthy contributions to teaching and other outstanding contributions to the research community should be listed: they provide context for the assessment of applicants’ research achievements and peer recognition, even if they do not directly enter the evaluation of those elements.

Weighting of the assessment of the proposed project and the assessment of the applicant

The focus of the evaluation should be on the scientific content of the proposal. In the past, both the proposal and the applicant were numerically graded on equal scales in the first step of the evaluation. As a result, an application from an apparently ‘strong’ PI with a weak proposal could end up with a combined score similar to that of an application from a less accomplished PI with a brilliant proposal. This exposed the evaluation to a higher risk of unconscious bias.

Under the new approach, only the project is scored on a numerical scale, and only this score can be used to rank the proposals before the panel discussion. The applicant is given an overall qualitative assessment with five options (outstanding / excellent / very good / good / non-competitive), which is not converted into a numerical score and is not combined with the score for the research project. In this way, the evaluation gives more weight to the project than to the applicant. This was already the practice in most ERC panels, and it is now explicitly stated in the ERC Work Programme.

Implementation of changes, guidance to applicants and evaluators

The ERC’s 90 peer review panels (28 each for Starting, Consolidator and Advanced Grants, five for Synergy Grants, and one for Proof-of-Concept Grants) meet each year and decide independently on the final ranking of the proposals submitted to them.

Applicants and panel members must have a clear understanding of what is expected of them and, in particular, the same understanding of how and for what purpose any element of information from the applicant is used by the panel for the evaluation. Explicit guidance on evaluation elements will also help to level the playing field for all applicants, regardless of their background or prior familiarity with ERC grants.

All applicants are now asked ‘to provide a list of up to ten research outputs […] with an emphasis on more recent achievements’ on the assumption that panels will be able to judge which ‘recent’ period is appropriate for any given CV and whether a particular achievement was relevant to the application.

Conclusion: an ongoing process

The Scientific Council continuously solicits input from the evaluation panels and has now set up a procedure for regularly responding to that input and taking action where necessary.

The effects of the changes will be closely monitored, and the changes could be refined in future following feedback from applicants, panel members, scientific officers of the ERC Executive Agency and the scientific community.

Read the whole report

The report can be downloaded from the ERC news page.

Considering applying for an ERC grant?

Ask for individual guidance: Ghent University's EU team is very experienced with ERC grants.

23 February 2024, 09:22