Selecting and using evidence: EFSA increases rigour, enhances consistency

A new report is intended to further improve EFSA’s methods for dealing with data and evidence in its scientific assessments, and to shed new light on how the Authority develops its scientific outputs.

“With this approach we are improving methodological rigour and enhancing consistency across EFSA in the steps we take to select evidence and to show how we decide which evidence is used, or not used – and why. This will make it easier to follow how evidence contributes to the final assessment and how we report the entire process and results,” stated Dr Marta Hugas, Head of Risk Assessment and Scientific Assistance at EFSA.

The report signals completion of the first stage of EFSA’s ongoing “Promoting methods for evidence use in scientific assessments” initiative. “Ultimately, this will improve the quality of our scientific work and communication of the outcomes to decision-makers, other scientists and EFSA’s stakeholders,” added Dr Hugas.

The report published today underlines impartiality, excellence (methodological quality), transparency, openness and responsiveness (fitness-for-purpose) as the Authority’s guiding principles for selecting and using evidence in scientific assessments. It also details each step of the process necessary to abide by them:

  • Upfront planning of the assessment strategy, defining the relevant data and the approach for collecting, appraising and integrating them
  • Carrying out the scientific assessment in line with the plan, and independently of prior knowledge of the results of the available studies
  • Verifying the process, to ensure alignment with the plan and the guiding principles
  • Documenting and reporting all steps, including deviations from the original plan

EFSA’s Executive Director Bernhard Url welcomed the approach as an important part of the Authority’s drive to be a more Open EFSA: “By clearly stating these principles and defining this process, EFSA is further enhancing the quality and the transparency of its scientific assessments.” Thorough upfront planning followed by coherent implementation will “make it easier to follow our decision-making and increase trust in the Agency’s scientific advice,” he said.

While primarily aimed at EFSA’s expert Panels and scientific staff, the principles and the process mapped out in this report should also be applied by scientific organisations carrying out work on EFSA’s behalf, and the framework could be adopted by risk assessors across Europe and beyond.

A second report, analysing the methods for dealing with evidence applied by EFSA in generic scientific assessments and in the area of regulated products, is scheduled for publication in late 2016.

An easy-to-read infographic shows how the ‘Promoting methods for evidence use’ approach works in practice.