Principles for good evidence synthesis for policy have been developed by the Royal Society and the Academy of Medical Sciences, with input from a range of experts. 

They outline the fundamental features of good evidence synthesis regardless of the precise timeframe, topic or methods.

They are intended as a set of guidelines for synthesising evidence and a checklist against which the quality of the synthesis process and results can be assured.

We encourage researchers, policymakers, brokerage organisations and others to apply these principles when undertaking or commissioning evidence synthesis to inform policy. Promoting best practice across a range of organisations and sectors will maximise the role of high-quality synthesised evidence in supporting well-founded policy decisions and public debate.


Inclusive

  • Involves policymakers and is relevant and useful to them
  • Considers many types and sources of evidence
  • Uses a range of skills and people

Evidence synthesis that involves policymakers throughout – from the design of the research question to the interpretation of findings – is most likely to yield significant policy insights. Keeping the process inclusive makes it more likely that it will identify the full range of relevant evidence types, sources and expertise. Teams of contributors should have a mix of skills in synthesis and could include some or all of the following: policymakers, practitioners, subject experts, statisticians, experts in databases and search terms, objective writers (usually non-subject experts), and independent reviewers. In practice, policymakers may be less involved during parts of the process if the aim is to scan the horizon for future priorities or to synthesise evidence on a topic that is yet to attract major policy interest.

Scientific Advisory Group for Emergencies (SAGE) – an inclusive case-study

The UK government's Scientific Advisory Group for Emergencies (SAGE) provides scientific and technical advice to inform government decision making during emergencies. Depending on the situation, many different types of evidence and expertise may need to be rapidly synthesised. This requires good networks and relationships between government bodies and external stakeholders.

A combination of factors made the Ebola outbreak that began in West Africa in 2013 very difficult to control. Ebola has a high mortality rate and, in this instance, spread quickly due to poverty, limited healthcare facilities, local burial customs, and distrust of government and healthcare officials. A rapid synthesis exercise – including consultation with infectious disease experts, anthropologists, behavioural scientists and historians – informed the government's response in both the UK and West Africa.

Rapid synthesis of scientific and other evidence was also required in 2011 when a magnitude 9 earthquake hit the east coast of Japan, leading to a power failure at the Fukushima nuclear plant. SAGE was convened, bringing together experts from within government (the Office for Nuclear Regulation, Health Protection Agency and Department of Health) and outside (the National Nuclear Laboratory, industry and academia). The evidence from the group was used to inform the advice issued to British nationals in Japan.


Rigorous

  • Uses the most comprehensive feasible body of evidence
  • Recognises and minimises bias
  • Is independently reviewed as part of a quality assurance process

Researchers should be as comprehensive as possible in identifying all the relevant sources and types of evidence on the topic within the timeframe and with the available resources, before critically appraising the quality of the evidence and analysing it rigorously. Those carrying out the synthesis should acknowledge potential sources of bias and aim to minimise their influence. Many of the principles outlined here help to minimise bias, or to disclose and explain any potential biases that exist. Given the challenges of combining different forms of evidence, independent expert scrutiny is always essential, although its scale and nature will need to be proportionate.


Transparent

  • Clearly describes the research question, methods, sources of evidence and quality assurance process
  • Communicates complexities and areas of contention
  • Acknowledges assumptions, limitations and uncertainties, including any evidence gaps
  • Declares personal, political and organisational interests and manages any conflicts

Synthesised evidence that is transparent is likely to be more credible, replicable and useful. A clearly described study design should include the search terms used, the databases and other evidence sources considered and when they were accessed, and the criteria that determine which studies are and are not included and why. Such measures make the synthesised evidence more useful in its own right and as a basis for undertaking further synthesis. In addition, explicitly acknowledging complexities, areas of strong consensus and contention – particularly where there are fundamental disagreements within the project team – is essential for a policymaker attempting to interpret the findings, and is important for ensuring well-founded public debate more broadly.

Oxford Martin Restatements – a transparent case-study

Oxford Martin Restatements review the natural science evidence base in areas of current policy concern and controversy. Policymakers are consulted throughout the evidence synthesis process, from selecting the topic, to defining the question, to reviewing the final report. Evidence to inform a restatement is taken from a thorough review of the full breadth of published peer-reviewed literature, followed by wide consultation with stakeholders (including academia, industry, non-governmental organisations and government).

Restatements are written so that they are accessible to an informed but non-specialist audience. The exact synthesis methods used and a quality grading of the evidence are clearly presented as part of the restatement. The final restatement is published in a peer-reviewed, open access academic journal, and several have been published in the Royal Society's journals.


Accessible

  • Is written in plain language
  • Is available in a suitable timeframe
  • Is freely available online

For synthesised evidence to be both useful and used it must be accessible. To be useful to the policymaker, either the main report or, if necessary, a short summary should be written in plain language by a writer who is experienced in presenting information clearly, concisely and as objectively as possible. To ensure the synthesised evidence is used, it must – of course – be made available in time to contribute to the decision-making process. In all but the most confidential situations, the full text and search terms should be published in an open access repository to allow the synthesised evidence to be extended, reproduced or updated in light of new evidence.

The Campbell Collaboration and Cochrane Libraries – an accessible case-study

The Campbell Collaboration promotes positive social and economic change through the production and use of systematic reviews and other evidence syntheses for policy and practice in education, social welfare, crime and justice, and international development. Cochrane provides a similar service for evidence-based medicine, promoting synthesised evidence to inform specific healthcare decisions.

The Campbell and Cochrane Libraries publish collections of systematic reviews in open access repositories. These repositories help to foster global collaboration and knowledge exchange, as well as being a resource for researchers and policymakers worldwide.

Across both organisations, co-ordinating groups manage the systematic review process: agreeing the title, appraising the proposed methodology and reviewing the final report. All reviews are expected to undergo consultation with external stakeholders, including policymakers, before the exact question is defined. The published articles follow a standard format in which methods are transparent, promoting rigour, confidence and ease of use.

The Campbell Collaboration also supports “evidence portals” developed by other organisations. Evidence portals are easy-to-navigate repositories, designed to meet the specific knowledge needs of the target audience. A good example is the Education Endowment Foundation's Teaching and Learning Toolkit which presents both the implementation cost and strength of evidence for a range of education interventions.