
Uncertainty in scientific assessments

We can never be completely certain about the future, either in science or in everyday life. Even when there is strong evidence that something will happen, there will almost always be uncertainty about the outcome. But by taking account of this uncertainty, we can often make better, more transparent decisions about things that may affect the outcome.

 

Background

Assessing and taking account of uncertainties is a normal part of scientific work and of everyday life. For example, meteorologists review satellite images to make predictions about the weather. They are rarely 100 per cent certain what will happen. So when they make a forecast, they usually indicate how likely it is. If they say there is a “strong chance” of rain, you will probably decide to take your umbrella when you go outside. If the chance of rain is “slight”, you are more likely to decide to leave your umbrella at home. If the forecaster uses percentages – a 90% or 10% chance of rain – for many of us the message becomes even clearer.

The same principles apply to food safety. For example, scientists may be asked to assess the safety of a new food, pesticide or food-borne bacteria. When evidence or knowledge is incomplete, they try to explain how the uncertainty may affect their conclusions.

They carry out an “uncertainty assessment” to identify and describe the scientific uncertainties, and explain the implications for decision-making. They may indicate whether or not there is more than one possible outcome and the relative likelihood of each.

As with the weather forecast, how certain you are (e.g. 10%, 50% or 90% certain) is important information for decision-making. Such information becomes crucial if the decisions have serious consequences for the health of people, animals and the environment.

Latest

August 2023: A new multimedia tutorial provides a practical introduction to how to use EFSA’s guidance documents on uncertainty analysis and communication in scientific assessments. The tutorial includes video explanations set within a dashboard that gives access to the key sections of these documents.

Milestones

  1. 2021

    November

    EFSA and DG SANTE (the European Commission Directorate-General for Health and Food Safety) jointly hold a two-day training workshop on uncertainty with risk managers entitled “Degree of certainty in scientific advice: implications for risk management and communication”. Risk managers from the Commission and Member State authorities take part, highlighting the usefulness of accessible uncertainty information and making suggestions for further improvements.

  2. 2019

    January

    EFSA publishes guidance on communicating uncertainty in scientific assessments. The guidance is a companion to the technical EFSA Scientific Committee guidance on uncertainty analysis in scientific assessments from 2018. EFSA is gradually implementing these two new guidance documents for assessors and communicators.

    EFSA’s new approach to communicating scientific uncertainties was made possible by combining the expertise of social scientists, natural scientists and communicators. The experts’ knowledge of social research on people’s understanding of uncertainties and their ability to apply this in a food safety context was critical in developing this new communications methodology.

  3. 2018

    Autumn

    EFSA implements its guidance on uncertainty analysis in two stages: in general scientific areas from autumn 2018, and in regulated product areas phased in over the following years.

  4. January

    Publication of the guidance on uncertainty analysis in scientific assessments, EFSA’s harmonised approach to assessing and taking account of uncertainties in food safety and in animal and plant health.

  5. 2017

    Social scientists join EFSA’s group of experts, specifically to produce guidance for communicators as a companion to the guidance for assessors.

  6. December

    An EFSA workshop with assessors and risk managers follows the trialling of EFSA’s draft guidance on uncertainty in scientific assessment, helping experts to finalise the new harmonised approach.

  7. 2016

    March

    EFSA’s Scientific Panels start to trial the revised draft guidance on at least one of their scientific assessments. Feedback from the public consultation in 2015 helped EFSA’s experts to revise and clarify key aspects of the previous draft.

  8. 2015

    June

    EFSA publicly consults on its draft guidance on Uncertainty in Scientific Assessment. The document proposes a new standardised toolbox of methodologies for analysing, explaining and accounting for uncertainties in scientific assessments.

    Leading experts and practitioners of regulatory science from across Europe and the globe take part in a workshop organised by EFSA to gather feedback and insights on its on-going efforts to harmonise and strengthen the cross-cutting methodologies that underpin its scientific assessments.

  9. 2013

    EFSA’s Scientific Committee requests a self-task mandate (an issue EFSA itself identifies, in the course of its regular work, as worthy of further consideration) to develop guidance on uncertainty in scientific assessment as part of a major push to increase robustness, transparency and openness of scientific assessments.

  10. 2009

    EFSA’s Scientific Committee publishes its general principles to ensure the transparency of risk assessment, including the need to identify and characterise uncertainties.

  11. 2007

    January

    EFSA’s Scientific Committee publishes a scientific opinion related to uncertainties in dietary exposure assessment.

EFSA's role

EFSA’s Scientific Committee develops harmonised risk assessment methodologies on scientific matters of a horizontal nature in the fields within EFSA's remit where EU-wide approaches are not already defined.

EFSA asked the Scientific Committee to develop guidance on how to characterise, document and explain uncertainties in risk assessment. This covers uncertainties at the various steps of risk assessment, i.e. hazard identification and characterisation, exposure assessment and risk characterisation. The harmonised approach is applicable to all relevant working areas of EFSA, but is phased in at different stages.

While developing its guidance for assessors, the Scientific Committee recognised the key function played by communication in the dialogue between assessors and decision-makers. Consequently, social scientists joined the Scientific Committee working group on uncertainty to develop a companion guidance for communicators.

FAQ

Science is the pursuit of knowledge. Scientists are constantly striving to fill in the gaps in human knowledge about how the world works. They often know a great deal about their specialist fields; they also know a lot about what is not known. Their confidence in their conclusions rests on the quality of the available scientific evidence, their experience and judgment in interpreting the evidence and their understanding of the possible impact of what they do not know (i.e. the uncertainty).

Identifying and describing scientific uncertainties, and explaining their implications for assessment conclusions, are crucial parts of providing transparent scientific advice. When dealing with uncertainty, decision-makers need to know what the different outcomes might be and how likely they are. How scientists report uncertainties, and how public bodies such as EFSA communicate them to decision-makers, stakeholders and the wider public, can alter perceptions of the risks and benefits of assessments and can affect related policy decisions. This can also directly or indirectly affect the choices made by individuals.

Risk assessors such as EFSA are responsible for describing uncertainty to decision-makers and other stakeholders when providing scientific advice. Decision-makers are responsible for resolving the impact of uncertainty on their decisions, i.e. deciding whether and in what way decision-making should take account of the uncertainty.

Scientists routinely strive to address a wide range of factors that can create uncertainty in their scientific assessments. EFSA’s Scientific Committee defines uncertainty as referring to “all types of limitations in the knowledge available to assessors at the time an assessment is conducted and within the time and resources available for the assessment”. Examples include:

  • Possible limitations in the quality and representativeness of data.
  • Comparing non-standardised data across countries or categories.
  • Choosing one predictive modelling technique over another.
  • Using default factors (such as the weight of an average adult), as illustrated in the sketch below.
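
To show how such assumptions feed through to a result, here is a minimal sketch in Python, with entirely hypothetical numbers, of a simple dietary exposure estimate that relies on a default adult body weight and a small set of residue measurements. The formula and all values are illustrative assumptions for this example and are not taken from any EFSA assessment or guidance.

```python
# Hypothetical illustration: how a default factor and a small data set
# introduce uncertainty into a simple dietary exposure estimate.
# Exposure (mg/kg bw per day) = concentration (mg/kg food) * consumption (kg/day) / body weight (kg)

concentrations = [0.08, 0.12, 0.05]   # small, possibly unrepresentative sample of residue measurements
consumption = 0.25                    # assumed daily consumption of the food (kg/day)
default_body_weight = 70.0            # default adult body weight (kg)

def exposure(concentration_mg_per_kg, body_weight_kg):
    """Point estimate of daily exposure per kg of body weight."""
    return concentration_mg_per_kg * consumption / body_weight_kg

mean_concentration = sum(concentrations) / len(concentrations)
central = exposure(mean_concentration, default_body_weight)

# Vary the assumptions to see how much the answer moves: lowest observed
# concentration with a heavier person, highest with a lighter person.
low = exposure(min(concentrations), 86.0)
high = exposure(max(concentrations), 60.0)

print(f"central estimate : {central:.5f} mg/kg bw per day")
print(f"plausible range  : {low:.5f} to {high:.5f} mg/kg bw per day")
```

Even in this toy calculation, the choice of default body weight and the spread of the few available measurements shift the estimate appreciably, which is exactly the kind of effect an uncertainty assessment is expected to report.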

Qualifying uncertainty with terms such as “negligible”, “low” or “high” can give a sense of the degree of certainty of an assessment outcome, but such terms are interpreted differently by different people. Quantifying uncertainty, for example on a percentage scale, is more effective because it reduces the room for ambiguity, and quantitative methods are generally more technically rigorous than qualitative ones. Quantifying uncertainty is therefore more robust and gives decision-makers a clearer picture.

Probability is the natural measure for expressing and understanding the relative likelihood of outcomes. EFSA’s Scientific Committee endorsed a scale, developed by the Intergovernmental Panel on Climate Change (IPCC), for quantifying the probability of uncertain outcomes.

Probability scale (IPCC, revised)

Probability term        Subjective probability range
Extremely likely        99-100%
Very likely             90-99%
Likely                  66-90%
As likely as not        33-66%
Unlikely                10-33%
Very unlikely           1-10%
Extremely unlikely      0-1%
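
As an informal illustration of how such a scale can be applied in practice, the short sketch below maps a numerical probability onto the approximate terms in the table above. The function name and the treatment of values that fall exactly on a range boundary are arbitrary choices made for this example; they are not prescribed by EFSA or the IPCC.

```python
# Illustrative mapping from a subjective probability (0-1) to the
# approximate probability terms in the scale above. How boundary values
# are assigned is an arbitrary choice for this sketch.

SCALE = [
    (0.99, "Extremely likely"),
    (0.90, "Very likely"),
    (0.66, "Likely"),
    (0.33, "As likely as not"),
    (0.10, "Unlikely"),
    (0.01, "Very unlikely"),
    (0.00, "Extremely unlikely"),
]

def probability_term(p: float) -> str:
    """Return the term whose range contains the probability p."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("probability must be between 0 and 1")
    for lower_bound, term in SCALE:
        if p >= lower_bound:
            return term
    return "Extremely unlikely"

print(probability_term(0.95))  # Very likely
print(probability_term(0.50))  # As likely as not
print(probability_term(0.05))  # Very unlikely
```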

For communicating these probabilities to non-technical audiences, we refer to our experts’ certainty (i.e. their “confidence”) regarding their conclusion. For example, the scientific assessors may say that their conclusion is 90-99% certain (very likely), in which case decision-makers and the public may be expected to have a high degree of confidence in measures that are in line with that conclusion. If the conclusion is 33-66% certain (as likely as not), the decision-maker may be less persuaded, may give greater weight to other, non-scientific factors (e.g. social or economic), and may be more inclined to take precautionary measures unless there is scope to reduce the uncertainty (e.g. through new research). If assessors consider a conclusion is 1-10% certain (very unlikely), decision-makers may give the scientific advice little weight when choosing how to proceed.

Quantifying uncertainties raises several challenges but it is not impossible. There are different quantitative methods for characterising uncertainty. EFSA’s revised draft guidance on uncertainty describes about 10 quantitative methods in detail. The choice of method may depend on such factors as the types of uncertainty identified and the expertise and time available for the assessment. Many data-related uncertainties such as limited sample size and measurement error can be quantified relatively easily using established statistical tools. In other cases, expert judgement will be needed and, although subjective, can be a great strength of scientific assessments if well-reasoned. EFSA published separate guidance on formal approaches to obtaining expert judgements in 2014, and is developing training for experts in making probability judgements. Whatever the method, it is important to describe clearly why and how each method was used.
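
As one concrete example of using an established statistical tool, the sketch below applies a simple bootstrap to a small, hypothetical sample of measurements and expresses the uncertainty about its mean as an approximate 95% interval. It is a generic statistical illustration, not one of the specific methods catalogued in EFSA’s guidance documents.

```python
import random

# Hypothetical small sample of measurements (e.g. concentrations of a contaminant).
sample = [1.8, 2.4, 1.9, 2.7, 2.1, 1.6, 2.3]

def bootstrap_mean_interval(data, n_resamples=10_000, coverage=0.95, seed=42):
    """Approximate a central `coverage` interval for the mean by resampling with replacement."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_resamples):
        resample = [rng.choice(data) for _ in data]
        means.append(sum(resample) / len(resample))
    means.sort()
    lower = means[int((1 - coverage) / 2 * n_resamples)]
    upper = means[int((1 + coverage) / 2 * n_resamples) - 1]
    return lower, upper

low, high = bootstrap_mean_interval(sample)
print(f"sample mean: {sum(sample) / len(sample):.2f}")
print(f"approximate 95% interval for the mean: {low:.2f} to {high:.2f}")
```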

It is never possible to quantify ‘unknown unknowns’ – uncertainties that we are not yet aware of – and even some of the known unknowns may be too complex or difficult for experts to quantify. EFSA’s Scientific Panels are asked to quantify as many as possible of the uncertainties affecting their assessments, and to describe qualitatively those they can identify but not quantify.

EFSA’s proposed approach is flexible and offers a selection of tools that can be adapted to the circumstances of each assessment. The time devoted to uncertainty would understandably be limited in an urgent situation where advice could be needed in a matter of hours, although it remains crucial to address because uncertainty is often greatest in such situations. More effort could be dedicated to assessing uncertainties during a longer-term comprehensive review of all available scientific knowledge. Likewise, different approaches would apply to well-studied issues with fewer uncertainties than to those at the forefront of scientific knowledge, where evidence may be scarcer.

The guidance is aimed primarily at the experts on EFSA’s Scientific Panels and their working groups, EFSA scientific staff and scientific organisations carrying out scientific work on EFSA’s behalf. It is also relevant for risk managers in the European Commission and EU Member States who take decisions on the basis of EFSA’s scientific advice. Once finalised, the guidance will apply to all areas of EFSA’s work and all types of scientific assessment, including risk assessment and all its constituent parts (hazard identification and characterisation, exposure assessment and risk characterisation).

Uncertainty assessment requires expert training both for assessors and for the decision-makers who use the assessments. EFSA is providing training to its scientists and working with EU risk managers as well as other European and international risk assessors to promote a harmonised understanding of uncertainty assessment.

In June 2015, EFSA called on the international scientific community, European and national risk assessors, risk communicators and risk managers, as well as EFSA’s stakeholders to provide feedback on its proposed systematic approach to uncertainty assessment. Input from other scientific advisory bodies as well as academic or applied experts in uncertainty analysis, particularly on the proposed methods contained in the toolbox, was needed to strengthen the draft before EFSA began to trial the approach across the full food safety panorama.

Tutorials

EFSA promotes understanding and the practical implementation of its guidance documents on uncertainty analysis and communication in scientific assessments. Alongside regular training sessions for EFSA’s scientific experts, members of EFSA networks, EFSA staff and risk managers at the EU institutions and in Member States, we have developed a series of tutorials that can be used by anyone with an interest or need to learn how to follow these approaches.

  • Multimedia tutorial: this dashboard provides a practical introduction to how to use EFSA’s guidance documents on uncertainty analysis and communication in scientific assessments. As well as a guide to the key sections of the two guidance documents, you can access EFSA opinions for which the guidance has been implemented.
EFSA Uncertainty Tutorial

A series of video tutorials tailored to chemical risk assessment (e.g. pesticides, food additives, contaminants) and non-chemical risk assessment (e.g. biological hazards, animal health, plant health) is also available from previous e-learning courses on how to perform uncertainty assessments.