The Cochrane Handbook recommends that a typical process for selecting studies for inclusion in a review should be as follows:
1. Collate all your search results using reference management software (such as RefWorks or EndNote) and remove any duplicate records.
2. Scan the titles and abstracts of your references, removing any that are obviously irrelevant. Articles and studies should be included or excluded based on the criteria set out in your protocol.
Ideally, the papers should be screened by at least two people, with the option to refer to a third where there is no agreement. Students conducting their own review will have to rely exclusively on their own judgment. If you have any doubts about whether to include a study, do not discard it at this stage.
3. Retrieve the full text of the remaining potentially relevant papers.
4. Check for, and then link together, any different papers which are multiple reports of the same study.
5. Check the full-text papers to see whether each study complies with your eligibility criteria. Keep notes of the studies you have excluded and the reasons why, as you will need to record this information in your PRISMA flow diagram (see our Searching systematically page). Ideally, at least two reviewers should evaluate the methodological quality of the full-text papers, using a checklist (such as CASP) to determine whether the studies meet the criteria outlined in your protocol.
6. Contact the authors, where necessary, to request more information on their methods or results in order to clarify whether their study meets your eligibility criteria. Record any studies that are unobtainable, or where you cannot obtain all the necessary details, as incomplete. List these in a table of ‘Studies awaiting assessment’ in your review.
7. Finalise which studies will be included in your review. Begin the data collection. It is preferable to have at least two reviewers extracting data. Use a recognised data extraction template, such as CASP, which you can adapt for your own review.
8. Add details of any ongoing trials which have not yet been reported, into an ongoing studies table.
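The de-duplication in step 1 is normally handled by reference management software, but the underlying logic is simple enough to sketch. This is a minimal illustration only; matching on DOI, or on a normalised title when the DOI is missing, is a simplifying assumption, and real tools use fuzzier matching:

```python
def normalise(title: str) -> str:
    """Lower-case a title and strip punctuation/whitespace so that
    near-identical records from different databases compare equal."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

def deduplicate(records):
    """Keep the first record seen for each DOI (or, failing that,
    for each normalised title)."""
    seen = set()
    unique = []
    for rec in records:
        key = rec.get("doi") or normalise(rec.get("title", ""))
        if key and key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

# Invented example records for illustration.
records = [
    {"doi": "10.1000/xyz1", "title": "Exercise therapy for low back pain"},
    {"doi": "10.1000/xyz1", "title": "Exercise Therapy for Low Back Pain"},  # same DOI
    {"doi": "", "title": "Yoga for chronic pain"},
    {"doi": "", "title": "Yoga for Chronic Pain."},  # same title, different casing
]
print(len(deduplicate(records)))  # 2 unique records remain
```

Keeping the *first* occurrence mirrors the advice above: when in doubt, retain the record rather than discard it.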
Once you have selected the relevant studies, you must evaluate their quality, as not all will have a sufficiently rigorous methodology to avoid biased results.
Use the checklists available via the links below to help you with this task. You may need to explore a few before you find one which is best for your needs.
AMSTAR (A MeaSurement Tool to Assess systematic Reviews): AMSTAR is an instrument for assessing the methodological quality of systematic reviews; it includes a checklist to support the assessment.
Centre for Evidence-Based Medicine (CEBM): This Critical Appraisal Tools webpage contains useful checklists and worksheets to help with the critical appraisal of different types of medical evidence.
CONSORT (Consolidated Standards of Reporting Trials): The CONSORT Statement comprises a 25-item checklist and a flow diagram. The checklist items focus on reporting how the trial was designed, analyzed, and interpreted; the flow diagram displays the progress of all participants through the trial.
Critical Appraisal Skills Programme (CASP): This set of eight critical appraisal tools is designed to be used when reading research. It includes tools for Systematic Reviews, Randomised Controlled Trials, Cohort Studies, Case Control Studies, Economic Evaluations, Diagnostic Studies, Qualitative Studies and Clinical Prediction Rules. The checklists can be printed or filled in electronically. The University of Sheffield has produced a video to explain how to use the CASP checklist.
Duke University Medical Center: The Evidence-Based Practice: Appraise section of this guide includes a variety of critical appraisal worksheets and checklists.
Joanna Briggs Institute (JBI): This Critical Appraisal Tools webpage from the JBI provides a variety of checklists for different types of research/study design.
Mixed Methods Appraisal Tool (MMAT): The MMAT critical appraisal tool is designed for the appraisal stage of reviews that include qualitative, quantitative and mixed methods studies. It facilitates appraisal of the methodological quality of qualitative research, randomized controlled trials, non-randomized studies, quantitative descriptive studies, and mixed methods studies.
PEDro Scale: The PEDro scale is designed to help people using the PEDro database identify which of the known or suspected randomised clinical trials archived on the database are likely to be internally valid (criteria 2-9), and could have sufficient statistical information to make their results interpretable (criteria 10-11).
PRISMA: The PRISMA checklist is a 27-item list of the elements a high-quality paper should contain. Use it to appraise the quality of a paper's structure, methodology, reporting and more. You can also apply it to your own paper to identify where you need to improve.
Scottish Intercollegiate Guidelines Network (SIGN): SIGN produces a number of checklists. These give a framework of questions to apply to each study to judge the likelihood of bias.
Critical Appraisal Worked Examples
This useful video from Cochrane Common Mental Disorders on critical appraisal is the first in a series of seven videos. It discusses the key concepts of critical appraisal; the full series can be found here. In each video, the presenters walk through the CASP checklist alongside an open access research article to demonstrate how to approach critical appraisal in practice.
This video, created by ScHARR, University of Sheffield MPH by Distance Learning, shows a demonstration of using a CASP checklist on Randomised Controlled Trials.
This video goes through the critical appraisal steps using an example qualitative article and the CASP checklist. It has also been created by ScHARR, University of Sheffield, MPH by Distance Learning.
This PowerPoint from UCL gives a worked example of critical appraisal of a paper: Session 5: how to critically appraise a paper – an example.
Critical Appraisal Tutorials
The e-learning modules by Amanda Burls and Anne Brice, Finding and Appraising the Evidence, take you through the process of finding the evidence and then assessing the validity and reliability of the published research in order to provide effective and efficient healthcare. In each case there is a tutorial and a paper that you can download to appraise, then check your answers against the ones on the site.
The University of Nottingham have also created a tutorial on Critical Appraisal of Clinical Trials in Dermatology, which includes a worked example of appraising a paper. You are asked to read an article on a dermatology clinical trial and then work through some interactive activities.
Critical Appraisal Additional Tools and Resources
There are lots of additional online resources which explain the process of critical appraisal for a variety of resources.
AGREE Enterprise Tools: The Appraisal of Guidelines for Research and Evaluation (AGREE) Instrument evaluates the process of practice guideline development and the quality of reporting. A variety of tools have been developed to assist in the development, reporting and evaluation of practice guidelines and health system guidance.
Claire Beecroft has produced a Prezi on critical appraisal in systematic and narrative reviews which covers: using PICO; checklists; and qualitative studies.
Colorado Community Colleges Online: A useful site for evaluating web resources. The resources look at how to evaluate a website, including details of the CRAP (Currency, Reliability, Authority, Purpose/Point of View) test.
Critical Appraisal for Health Students: LibGuide from Teesside University. This guide is aimed at health students who are required to undertake critical appraisal of research. It provides basic level support for appraising qualitative and quantitative research papers. It's designed for students who have already attended lectures on critical appraisal. Frameworks to appraise both types of research are provided along with an opportunity to practise appraisal skills using a research paper with suggested answers.
International Centre for Allied Health Evidence: Includes a video from Idaho State University Library on appraising the literature and a list of critical appraisal tools, linked to the websites where they were developed.
King's College London has created two videos on critical appraisal.
OT Seeker Tutorial: Tutorial on critical appraisal which includes information on: an introduction to critical appraisal; randomised controlled trials; and systematic reviews.
Oxford Health NHS Trust, Critical Appraisal Tools: Useful webpage containing links to checklists, toolkits and e-learning modules.
Specialist Unit for Review Evidence: The resources from the Specialist Unit for Review Evidence at Cardiff University include useful definitions, manuals, checklists and guides for systematic reviewers.
UCL Great Ormond Street Institute of Child Health: This document on the critical appraisal of a journal article provides a useful summary explaining the process of critical appraisal. It gives a glossary of relevant statistical terms, links and a bibliography.
Understanding Health Research Tool: Tool designed to help people understand and review published health research to decide how dependable and relevant a piece of research is. It gives pointers towards the sort of questions to ask about a paper.
University of Suffolk: The advanced literature search guide for nursing and health sciences has a section dedicated to critical appraisal. There are links to resources and checklists, as well as an introductory 10-question quiz to see how much you already know about critical appraisal.
During your systematic review, you will probably be working with a large amount of data. You will need to extract data from relevant studies in order to examine and compare results. While the data is being extracted, it is very important to employ good data management practices.
The video from NYU Health Sciences Library provides some tips for avoiding a data management nightmare!
The aim of a systematic review is to identify relevant studies, to synthesise the data about their study design and results, and to identify any risk of bias. The findings will therefore depend on which data from the studies are selected and appraised. The data should be complete and accurate, and should also be accessible for data sharing and for any future review updates. It is vital that the process used to make selection decisions is transparent and minimises human error and bias.
Once you have screened the entire list of references, you will be left with a core group of studies to be included in your review. The next step is to extract the data from each of the studies in order to synthesise their results. The extraction process should be tracked using a standardised data extraction form. Data can also be coded for computer analysis.
There are several established data extraction forms that you can use as a template, ensuring you adapt for the requirements of your own review. Your form should be in line with your specific PICO framework. Take care to record sufficient detail and consider including a section with details of information absent from primary studies. You may need to contact the authors of specific papers. Ensure you record the potential impact of ‘missing data’, as this should be discussed in your findings.
Data Extraction Forms
There are some template data collection forms below which can be tailored to the specific topic of a review. The completed data collection forms (reflecting the consensus of opinion) should be submitted together with the review.
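A standardised extraction form is, at its simplest, a fixed set of fields completed once per study. The sketch below builds such a template as a CSV; the field names are illustrative only (loosely aligned with PICO) and are not taken from any published template, so adapt them to your own protocol:

```python
import csv
import io

# Illustrative field names only -- adapt to your own protocol and PICO framework.
FIELDS = [
    "study_id", "first_author", "year", "country",
    "population", "intervention", "comparator", "outcomes",
    "study_design", "sample_size", "risk_of_bias_notes",
    "missing_data",  # record anything absent from the primary study
]

def blank_form(study_id: str) -> dict:
    """Return an empty extraction record for one study."""
    form = dict.fromkeys(FIELDS, "")
    form["study_id"] = study_id
    return form

# Write a reusable template with one row per included study.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=FIELDS)
writer.writeheader()
writer.writerow(blank_form("Smith2021"))  # hypothetical study ID
print(buffer.getvalue().splitlines()[0])  # the header row of the template
```

A dedicated 'missing_data' field makes the gaps in primary studies explicit, so their potential impact can be discussed in your findings as recommended above.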
Data Extraction Tools
The amount and types of data you collect, as well as the number of collaborators who will be extracting it, will dictate which extraction tools are best for your project. Programs like Excel may be the best option for smaller or more straightforward projects, while systematic review software platforms can provide more robust support for larger or more complicated data. A selection of data extraction tools are listed below:
Excel is the most basic tool for the management of the screening and data extraction stages of the systematic review process. Customised workbooks and spreadsheets can be designed for the review process. A more advanced approach to using Excel for this purpose is the PIECES approach, designed by a librarian at Texas A&M. The PIECES workbook is downloadable here.
Covidence is a software platform built specifically for managing each step of a systematic review project, including data extraction. There is lots of support for Covidence available on the Cochrane website.
DistillerSR is a systematic review management software program, similar to Covidence. It guides reviewers in creating project-specific forms, extracting, and analysing data.
NVivo is a software program used for qualitative and mixed-methods research. Specifically, it is used for the analysis of unstructured text, audio, video, and image data, including (but not limited to) interviews, focus groups, surveys, social media, and journal articles. It is produced by QSR International.
Rayyan is a free web-based screening tool for systematic reviews. There is a video tour of Rayyan available here: https://www.youtube.com/embed/irAOQgzFMs4 and a LibGuide from McGill which offers lots of support: https://libraryguides.mcgill.ca/rayyan
RevMan is free software used to manage Cochrane reviews. There is lots of support for RevMan on the Cochrane Training site.
SRDR (Systematic Review Data Repository) is a web-based tool for the extraction and management of data for systematic review or meta-analysis. It is also an open and searchable archive of systematic reviews and their data. Access the "Create an Extraction Form" section for more information.
JBI Sumari (the Joanna Briggs Institute System for the Unified Management, Assessment and Review of Information) is a systematic review software platform geared toward fields such as health, social sciences, and humanities. Among the other steps of a review project, it facilitates data extraction and data synthesis. There are tutorials on the website which cover data extraction and more.
The Systematic Review Toolbox
The SR Toolbox is a community-driven, searchable, web-based catalogue of tools that support the systematic review process across multiple domains. Use the advanced search option to restrict to tools specific to data extraction.
Additional information on data extraction is available from the resources below.
Searches should be reported in enough detail so that they can be reproduced by someone else if necessary. The PRISMA 2020 checklist states for '#7 Search Strategy' that you should "Present the full search strategies for all databases, registers and websites, including any filters and limits used".
CRD (2009) suggest that when describing electronic database searches in a systematic review you should include:
These search strategies normally appear in an appendix or as supplementary material to a published systematic review as they are too long to include in the main part of the systematic review. Some published systematic reviews only include a search strategy optimised for one database whilst others will publish the search strategies for all databases searched.
The search should be described in the methods section of the review and the detailed description of the search can be made available in the appendix. For example, see the Methods section and the Appendices from the Cochrane review Cognitive rehabilitation for attention deficits following stroke.
Some databases, such as those on the EBSCO and OVID platforms, allow you to export your search strategy into a Word document.
There is step-by-step written guidance and a video showing how you can export all of the information from an EBSCO database into a Word document in just a few steps, available on the CINAHL LibGuide.
To export your search strategy from the Ovid platform, use the Email All Search History option. The email you receive will contain a link to the search and a copy of the search history as a table, which can then be copied and pasted into Word. The search link can be shared with anyone who has access to the database.
You can share a search from a database in Ovid (e.g. Embase) using the Email All Search History (see details above) or the Copy Search History Link buttons (underneath the search history).
This will be useful for sharing a search strategy with a colleague so that they can see your search, its results and adapt it (and vice versa).
It will only work if the link is shared with someone who can also access the database (i.e. another member of the University).
The information from the flow diagram is reported in the results section of a systematic review.
The information recorded can differ, even in Cochrane Reviews, but the flow diagram can be a useful place to summarise what databases and other resources have been searched and the reasons why full text articles which were assessed have been excluded.
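The flow diagram is essentially bookkeeping: the count at each stage is the previous count minus the records removed at that stage. A minimal sketch of that arithmetic, using invented numbers purely for illustration:

```python
# A minimal sketch of the bookkeeping behind a PRISMA flow diagram.
# All counts below are invented for illustration.
identified = {"MEDLINE": 412, "Embase": 530, "CINAHL": 198}
duplicates_removed = 260
excluded_on_title_abstract = 701
full_text_exclusions = {            # reason -> number of reports excluded
    "wrong population": 12,
    "wrong study design": 9,
    "no usable outcome data": 4,
}

records_identified = sum(identified.values())                     # 1140
records_screened = records_identified - duplicates_removed        # 880
reports_assessed = records_screened - excluded_on_title_abstract  # 179
studies_included = reports_assessed - sum(full_text_exclusions.values())  # 154

print(f"Identified: {records_identified}, screened: {records_screened}, "
      f"assessed full text: {reports_assessed}, included: {studies_included}")
```

Recording the per-database counts and the per-reason exclusion counts as you go, rather than reconstructing them afterwards, makes the diagram straightforward to complete.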
Examples of different PRISMA flow diagrams can be found in published Cochrane systematic reviews.
For further information on this topic see the additional resources below.