Online 2020 | Software

Google Scholar is one of the resources most commonly used by researchers for everyday information retrieval. Although designed for ‘lookup’ searches (i.e. finding one or more specific records), it may also be a useful additional resource in evidence reviews (including systematic reviews) for both academic and grey literature.


Flowcharts in evidence syntheses allow the reader to rapidly understand the core procedures used in a review and to examine the attrition of irrelevant records throughout the review process. The PRISMA flow diagram, published in 2009, describes the sources, numbers and fates of all identified and screened records in a review. PRISMA is currently in the final stages of a 2020 update, which includes a new version of the PRISMA flow diagram.
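
As a toy illustration of the information such a diagram summarises, the sketch below represents record counts at successive review stages and the attrition between them; all numbers are invented for demonstration.

```r
# Toy record counts at successive review stages (invented numbers), and the
# attrition between them, as summarised in a flow diagram.
stages <- c(
  records_identified       = 1200,
  after_deduplication      = 950,
  passed_title_abstract    = 180,
  included_after_full_text = 42
)

attrition <- -diff(stages)   # records excluded at each successive step
attrition
```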

Systematic reviews should be reported with a high degree of methodological detail. The ROSES reporting standards call for this level of reporting detail in both systematic reviews and systematic maps. An integral part of the methodological description of a review is a flow diagram/chart.

bibfix is an R package and Shiny app that helps users repair and enrich their bibliographic data. It does so through a suite of functions that request bibliographic data from the OpenAlex API.
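
As a rough illustration of the kind of request involved (this is not bibfix's own code), the sketch below retrieves a single record from the OpenAlex API by DOI and pulls out fields that could be used to fill gaps in an incomplete reference; the DOI is only an example.

```r
# A minimal sketch of querying the OpenAlex API for one record by DOI.
# Illustrative only; not the bibfix implementation.
library(httr)
library(jsonlite)

doi  <- "10.1371/journal.pmed.1000097"   # example DOI (an open-access article)
resp <- GET(paste0("https://api.openalex.org/works/doi:", doi))
stop_for_status(resp)
work <- fromJSON(content(resp, as = "text", encoding = "UTF-8"))

# Fields that could fill gaps in an incomplete bibliographic record
work$title
work$publication_year
work$authorships$author$display_name
```

When querying OpenAlex in bulk, the API documentation recommends adding your email address to requests (the `mailto` parameter) and keeping request rates polite.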


When searching for research articles, we often want to obtain the lists of references cited across a set of studies, as well as lists of articles that cite a particular study. In systematic reviews, this supplementary search technique is known as ‘citation chasing’: forward citation chasing looks for all records citing one or more articles of known relevance; backward citation chasing looks for all records referenced in one or more articles.
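
One way to sketch both directions is against the OpenAlex API, where each record carries its own reference list and the records that cite it can be requested with a filter. The seed identifier below is an arbitrary example, and this is not the implementation of any particular tool.

```r
# Illustrative citation chasing via the OpenAlex API (one possible source;
# dedicated tools may use other providers). The seed ID is an arbitrary example.
library(httr)
library(jsonlite)

seed <- "W2741809807"   # OpenAlex work ID of an article of known relevance

# Backward chasing: records referenced by the seed article
seed_work <- fromJSON(content(
  GET(paste0("https://api.openalex.org/works/", seed)),
  as = "text", encoding = "UTF-8"))
backward_ids <- seed_work$referenced_works

# Forward chasing: records that cite the seed article
citing <- fromJSON(content(
  GET(paste0("https://api.openalex.org/works?filter=cites:", seed, "&per-page=50")),
  as = "text", encoding = "UTF-8"))
forward_titles <- citing$results$title
```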


Screening articles for evidence synthesis is typically done using titles and abstracts; however, full-text screening can be more efficient because there is no guesswork involved in assessing whether the methods of a study match the inclusion criteria. The full text of articles is also necessary for coding article metadata, such as study location or the types of data collected, and for analysing article content, for example with topic modelling. doi2txt facilitates full-text article screening, coding and analysis by retrieving plain-text versions of journal articles based on a DOI, if available, or on bibliographic information such as title and authors when a DOI is not available or known. It also contains functions for processing full-text articles, such as coding articles against an ontology of topics, or geocoding articles to obtain actual or approximate latitude and longitude.
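
In the simplest case, the retrieval step might look something like the sketch below (not doi2txt's actual code, and only workable where access permits): resolve the DOI through doi.org to the publisher landing page and extract its visible paragraph text. Real tools need per-publisher handling, access checks and a fallback to title/author searches.

```r
# A rough, illustrative sketch of DOI-based full-text retrieval (not the
# doi2txt implementation): resolve the DOI and scrape visible paragraph text.
library(httr)
library(rvest)

doi  <- "10.1371/journal.pmed.1000097"  # example DOI of an open-access article
resp <- GET(paste0("https://doi.org/", doi))  # httr follows redirects to the landing page
stop_for_status(resp)

page       <- read_html(content(resp, as = "text", encoding = "UTF-8"))
paragraphs <- html_text2(html_elements(page, "p"))
full_text  <- paste(paragraphs, collapse = "\n")
```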

greylitsearcher is a web-based tool for performing systematic and transparent searches of organisational websites. It uses Google’s site search functionality, which allows you to search across all pages of a given website, to make these searches structured and transparent.
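
A minimal illustration of the idea of a site-restricted search, assuming Google’s ‘site:’ operator (the website and search terms below are placeholders, and this is not greylitsearcher’s own code):

```r
# Build a Google query restricted to a single organisational website.
# The website and terms are placeholders; open the resulting URL in a browser.
site  <- "www.example.org"
terms <- '"climate adaptation" report'
query <- paste0("site:", site, " ", terms)

search_url <- paste0("https://www.google.com/search?q=",
                     URLencode(query, reserved = TRUE))
search_url
# browseURL(search_url)  # opens the site-restricted search in a browser
```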


One of the most important steps in the process of conducting a systematic review or map is data extraction and the production of a database of coding, metadata and study data. There are many ways to structure these data, but to date, no guidelines or standards have been produced for the evidence synthesis community to support their production. In addition, easily machine-readable, readily reusable and adaptable databases are rarely adopted, even though such databases would be easier for review authors to translate into different formats (for example for tabulation, visualisation and analysis) and easier for readers of the review/map to reuse. As a result, it is common for systematic review and map authors to produce bespoke, complex data structures that, although typically provided digitally, require considerable effort to understand, verify and reuse.
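
As one example of what an easily machine-readable structure can look like, a ‘long’ coding table with one row per study–variable–value combination is simple to filter, reshape, tabulate and visualise. The column names and values below are purely illustrative, not a proposed standard.

```r
# An illustrative "long" (tidy) coding table: one row per study, variable and
# value. Names and values are invented for demonstration, not a standard.
library(tibble)
library(tidyr)

coding <- tribble(
  ~study_id,   ~variable,      ~value,
  "Smith2019", "country",      "Kenya",
  "Smith2019", "study_design", "before-after",
  "Lee2021",   "country",      "Vietnam",
  "Lee2021",   "study_design", "randomised controlled trial"
)

# Reshape to a wide, human-readable table for the review report
wide <- pivot_wider(coding, names_from = variable, values_from = value)
wide
```

Because such a table is flat, it exports cleanly to plain formats such as CSV and can be re-aggregated by readers in whichever direction suits their question.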
