International Journal of Morphology

Online version ISSN 0717-9502

Int. J. Morphol. vol.38 no.3 Temuco jun. 2020

http://dx.doi.org/10.4067/S0717-95022020000300774 

Articles

Statements, Recommendations, Proposals, Guidelines, Checklists and Scales Available for Reporting Results in Biomedical Research and Quality of Conduct. A Systematic Review

Declaraciones, Recomendaciones, Propuestas, Directrices, Listas de Verificación y Escalas Disponibles para Informar Resultados y Calidad Metodológica en Investigación Biomédica. Revisión Sistemática

Tamara Otzen1,2

Carlos Manterola1,2

Mirian Mora2,3

Guissella Quiroz2

Paulina Salazar2

Nayely García2

1 Center of Morphological and Surgical Studies (CEMyQ), Universidad de La Frontera, Temuco, Chile.

2 PhD program in Medical Sciences, Universidad de La Frontera, Temuco, Chile.

3 Universidad del Azuay, Ecuador.

SUMMARY:

Research reporting statements, recommendations, proposals, guidelines, checklists and scales can improve the quality of reporting of results in biomedical research. The aim of this study was to describe the statements, recommendations, proposals, guidelines, checklists and scales available for reporting results and quality of conduct in biomedical research. Systematic review. All types of statements, recommendations, proposals, guidelines, checklists and scales generated to improve the quality of reporting of biomedical research results were included. Data sources: the databases EMBASE, HINARI, MEDLINE and Redalyc; the libraries BIREME-BVS, SciELO and The Cochrane Library; the meta-searchers Clinical Evidence and TRIP Database; and the Websites of the EQUATOR Network, BMC Medical Education and EUROPE PMC. The recovered documents were grouped by study design: systematic reviews (SR), meta-analyses and meta-reviews; clinical trials (CTs), randomized clinical trials (RCTs) and quasi-experimental studies; observational studies; diagnostic accuracy studies; clinical practice guidelines; biological material, animal and preclinical studies; qualitative studies; economic evaluation and decision analysis studies; and methodological quality (MQ) scales. A total of 93 documents were obtained: 19 for SR (QUOROM, MOOSE, AMSTAR, AMSTAR 2, PRISMA, PRISMA-Equity, PRISMA-C, PRISMA-IPD, PRISMA-NMA, PRISMA-RR, PRESS, PRISMA-Search, PRISMA-TCM, PRISMA-ScR, PRISMA-DTA, PRISMA-P, MARQ, GRAPH, ROBIS), 32 for CTs and RCTs (CONSORT and its update, STRICTA, RedHot, NPT, CONSORT-PRO, CONSORT-SPI, IMPRINT, TIDieR, CT in orthodontics, "n-of-1", PAFS, KCONSORT, STORK, Protocol health data, SW-CRT, ADs, MAPGRT, PRT, TREND, GNOSIS, ISPOR RCT Report, Newcastle-Ottawa, REFLECT, Ottawa, SPIRIT, SPIRIT-C, SPAC, StaRI, TRIALS, ROBINS-I, ROB 2), 11 for observational studies (STROBE, STREGA, STROBE-nut, INSPIRE, STROME-ID, STROBE-Vet, RECORD, ORION, STNS, MInCir-ODS, GATHER), 10 for diagnostic accuracy studies (STARD and its update, ARDENT, QUADAS, QUADAS-2, QAREL and its update, GRRAS, TRIPOD, APOSTEL), 3 for clinical practice guidelines (AGREE, AGREE II, RIGHT), 10 for biological material, animal and preclinical studies (MIAME, REMARK, SQUIRE, SQUIRE 2.0, REHBaR, ARRIVE, GRIPS, CARE, AQUA, PREPARE), 5 for qualitative studies (COREQ, ENTREQ, GREET and its update, SRQR), and 3 for economic evaluations (NHS-HTA, NICE-STA, CHEERS). There is a great variety of statements, recommendations, proposals, guidelines, checklists with their extensions, and scales available. These can be used by authors, reviewers and editors to improve the quality of reporting and the quality of conduct of scientific articles.

KEY WORDS: "Checklist"[Mesh]; "Research Report"[Mesh]; "Research Design"[Mesh]; "Evidence-Based Medicine"[Mesh]

RESUMEN:

El uso de recomendaciones, propuestas, listas de verificación y escalas puede mejorar la calidad del informe de resultados en investigación biomédica. El objetivo de este estudio fue describir las declaraciones, recomendaciones, propuestas, directrices, listas de verificación y escalas disponibles para informar resultados y calidad metodológica en investigación biomédica. Revisión sistemática. Se incluyeron todos los tipos de declaraciones, recomendaciones, propuestas, pautas, listas de verificación y escalas disponibles para informar resultados y calidad metodológica en investigación biomédica. Fuentes de datos: EMBASE, HINARI, MEDLINE y Redalyc; bibliotecas BIREME-BVS, SciELO y The Cochrane Library; metabuscadores Clinical Evidence y TRIP Database; sitios Web EQUATOR Network, BMC Medical Education y EUROPE PMC. Los documentos recuperados se agruparon por tipo de diseño de estudio: revisiones sistemáticas (RS), ensayos clínicos (EC), estudios cuasi experimentales, observacionales, de precisión diagnóstica, guías de práctica clínica (GPC); de material biológico, estudios animales y preclínicos; estudios cualitativos; estudios de evaluación económica y estudios de análisis de decisiones; y escalas de calidad metodológica (CM). Se obtuvieron 93 documentos: 19 para RS (QUOROM, MOOSE, AMSTAR, AMSTAR 2, PRISMA, PRISMA-Equity, PRISMA-C, PRISMA-IPD, PRISMA-NMA, PRISMA-RR, PRESS, PRISMA-Search, PRISMA-TCM, PRISMA-ScR, PRISMA-DTA, PRISMA-P, MARQ, GRAPH, ROBIS), 32 para EC (CONSORT y su actualización, STRICTA, RedHot, NPT, CONSORT-PRO, CONSORT-SPI, IMPRINT, TIDieR, EC en ortodoncia, "n-de-1", PAFS, KCONSORT, STORK, datos de salud del protocolo, SW-CRT, ADs, MAPGRT, PRT, TREND, GNOSIS, ISPOR RCT Report, Newcastle-Ottawa, REFLECT, Ottawa, SPIRIT, SPIRIT-C, SPAC, StaRI, TRIALS, ROBINS-I, ROB 2), 11 para estudios observacionales (STROBE, STREGA, STROBE-nut, INSPIRE, STROME-ID, STROBE-Vet, RECORD, ORION, STNS, MInCir-ODS, GATHER), 10 para estudios de precisión diagnóstica (STARD y su actualización, ARDENT, QUADAS, QUADAS-2, QAREL y su actualización, GRRAS, TRIPOD, APOSTEL), 3 para GPC (AGREE, AGREE II, RIGHT), 10 para material biológico, animal y estudios preclínicos (MIAME, REMARK, SQUIRE, SQUIRE 2.0, REHBaR, ARRIVE, GRIPS, CARE, AQUA, PREPARE), 5 para estudios cualitativos (COREQ, ENTREQ, GREET y su actualización, SRQR), y 3 para evaluaciones económicas (NHS-HTA, NICE-STA, CHEERS). Existe una gran variedad de instrumentos disponibles. Estos pueden ser utilizados por autores, revisores y editores para mejorar la calidad del informe y de la CM de los artículos científicos.

PALABRAS CLAVES: "Checklist"[Mesh]; "Research Report"[Mesh]; "Research Design"[Mesh]; "Evidence-Based Medicine"[Mesh]

BACKGROUND

Daily clinical decisions are usually based on personal experience and on the evidence available from scientific studies. It is therefore imperative that publications not only provide precise information regarding the methodology used and the results obtained, but also be structured to facilitate reading and comprehension.

The first experience with this model was the CONSORT statement, published in 1996 (Begg et al., 1996), revised in 2001 (Moher et al., 2001) and 2004 (Campbell et al., 2004), and updated in 2010 (No authors listed, 2010). Its objective was to improve the quality of reporting of clinical trials (CT) and randomized clinical trials (RCT). It became the example to follow, motivating various research groups to generate proposals aimed at improving the reporting of research results.

There are a number of reasons why recommendations, guidelines, checklists and scales for authors are needed. To begin with, authors are faced with the responsibility of persuading reviewers and scientific journal editors of the quality of their study. Undoubtedly, an adequate investigation is critical in this process, though a proper report of the objectives, design, eligibility criteria, sample size and type of sampling, among others, is no less important. These are some examples of information that will allow a reader to critically evaluate the study. Giving insufficient information can confuse the reader, whereas giving too much information can overstate a trivial problem.

On the other hand, there are some instruments aimed at evaluating the methodological quality (MQ) or risk of bias of published articles.

The aim of our study was to describe statements, recommendations, proposals, guidelines, checklists and scales available for reporting results and quality of conduct in biomedical research.

MATERIAL AND METHOD

This manuscript was written following the PRISMA statement (Moher et al., 2009).

Study Design. Systematic review (SR).

Eligibility criteria. All types of statements, recommendations, proposals, guidelines, checklists and scales intended to improve the quality of reporting of biomedical research results, published from 1996 onwards, were included. No language restriction was applied. No exclusion criteria were considered.

Data Source. A search was made in the databases EMBASE, HINARI, MEDLINE and Redalyc; in the libraries BIREME-BVS, SciELO and The Cochrane Library; in the meta-searchers Clinical Evidence and TRIP Database; and on the Websites of the EQUATOR Network, BMC Medical Education and EUROPE PMC. The closing date was August 30, 2019.

Search. Sensitive search strategies were applied to the available literature, without restriction by year, language or publication status (published, unpublished, in process of publication). For this, MeSH and DeCS terms, free terms, the Boolean operators AND/OR, truncation and limits were used. The full electronic search strategies for each data source are summarized in Table I.

Table I Search strategies according to the source of information used. 
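As an illustration only (the strategies actually applied are those summarized in Table I), a sensitive MEDLINE query combining MeSH terms, free-text terms, Boolean operators and truncation could be run programmatically through the public NCBI E-utilities esearch endpoint. This is a minimal sketch; the query string and parameter values are hypothetical examples, not the strategies used in this review.

# Sketch: running a sensitive PubMed/MEDLINE search via NCBI E-utilities.
# The query string and retmax value are illustrative, not the Table I strategies.
import requests

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

query = (
    '("Checklist"[Mesh] OR checklist*[tiab] OR guideline*[tiab]) '
    'AND ("Research Report"[Mesh] OR "Research Design"[Mesh] OR reporting[tiab])'
)

response = requests.get(
    EUTILS,
    params={"db": "pubmed", "term": query, "retmode": "json", "retmax": 100},
    timeout=30,
)
response.raise_for_status()
result = response.json()["esearchresult"]

print("Records found:", result["count"])
print("First PMIDs:", result["idlist"][:10])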

Publication selection. The process for selecting studies included identification, screening, eligibility, and final inclusion of primary studies in the SR.

Data collection process. The review of the articles was carried out in three stages: first the titles were reviewed, then the abstracts, and subsequently the full texts. This process was carried out by two groups of three researchers each (Group 1: CM, GQ and PS; Group 2: TO, MM and NG). Disagreements were resolved by consensus between the two review groups. Articles that initially met the inclusion criteria were selected for full-text review.

Data items. A data extraction form was used that included information on the name, year of publication, number of items, assigned score, domains, versions, objective, type of study design and observations. The recovered documents were grouped by study design: systematic reviews (SR), meta-analyses and meta-reviews; CTs, RCTs and quasi-experimental studies; observational studies; diagnostic accuracy studies; clinical practice guidelines; biological material, animal and preclinical studies; qualitative studies; economic evaluation and decision analysis studies; and MQ scales.
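As a hedged illustration of how one record from such an extraction form can be structured (the field names simply mirror the variables listed above, and the example values loosely follow the QUOROM description given in the Results; this is not the form actually used):

# Sketch of a single data-extraction record; field names mirror the form described
# above and the example values loosely follow the QUOROM entry in the Results.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ExtractionRecord:
    name: str                 # name of the statement, checklist or scale
    year: int                 # year of publication
    n_items: int              # number of items
    assigns_score: bool       # whether a score is assigned (illustrative field)
    domains: List[str]        # domains grouping the items
    versions: List[str]       # versions or updates
    objective: str            # stated objective of the instrument
    study_design: str         # type of study design it targets
    observations: Optional[str] = None

record = ExtractionRecord(
    name="QUOROM",
    year=1999,
    n_items=18,
    assigns_score=False,      # not stated in the text; illustrative value
    domains=["title/summary", "introduction", "methods", "results", "meta-analysis discussion"],
    versions=["1999"],
    objective="Reporting of SR results based on CT",
    study_design="Systematic reviews and meta-analyses",
)
print(record.name, record.year, record.n_items)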

Summary measures. No statistical tools were used, because it is a qualitative SR.

Ethics. Names of authors and centers were masked.

RESULTS

The search made it possible to retrieve 1233 documents, 189 of which were duplicates across the selected sources. After reviewing the eligibility requirements, 93 documents were retained, making up the population under study, as can be seen in the flow diagram (Fig. 1). These are described below and in further detail in Table II.

Fig. 1 PRISMA flowchart of participant studies. 

Table II Summary of information collected by type of study designs. 

NR: Not Reported.

Systematic reviews, meta-analyses and meta-reviews. A total of 7 checklists, 11 extensions, and 1 update were obtained (n=19).

1. QUOROM Statement. Its objective was to create a tool for the reporting of SR results based on CT. Composed of 6 domains (title/summary, introduction, methods, results and meta-analysis discussion) and 18 items, which include a flow diagram (Moher et al., 1999).

2. MOOSE Proposal. Its objective was to develop an instrument with recommendations for the meta-analysis of observational studies. Composed of 35 items, grouped into 6 domains (Background, Search strategy, Methods, Results, Discussion, Conclusions) (Stroup et al., 2000).

3. AMSTAR Statement. It is a measurement tool to assess the methodological quality of SR that include 11 items (Shea et al., 2007). In 2017, the version AMSTAR 2 for SR that include randomized or non-randomized studies of healthcare interventions was published, including 16 items, with simpler response categories than the original AMSTAR (Shea et al., 2017).

4. PRISMA Statement. It is the update of QUOROM (Moher et al., 1999). Its objective was to address conceptual and practical advances in SRs. Composed of 27 items, grouped into 7 domains (title, abstract, introduction, methods, results, discussion and funding) (Moher et al., 2009). It comprises a series of extensions, including: PRISMA-Equity, published in 2012 (Welch et al., 2012) and updated in 2015 as PRISMA-E 2012, for SR and meta-analyses with a focus on health equity, defined as the absence of avoidable and unfair inequalities in health (Welch et al., 2016); PRISMA-C, published in 2014, as protocols for SR and meta-analyses of RCT or observational studies of newborn and child health research (Kapadia et al., 2016); PRISMA-IPD, an extension for SR and meta-analyses of individual participant data, published in 2015 (Stewart et al., 2015); PRISMA-NMA, an extension statement for SR incorporating network meta-analyses of health care interventions, published in 2015 (Hutton et al., 2015); PRISMA-RR, for reporting rapid reviews, including those with analogous terminology (e.g. rapid evidence synthesis, rapid knowledge synthesis), published in 2015 (Stevens, 2015); PRESS, published in 2008-2010 and updated in 2016, as a guide to improve the peer review of electronic literature search strategies (McGowan et al., 2016); PRISMA-Search, for reporting literature searches in SR, published in 2016 (Rethlefsen et al., 2016); PRISMA-TCM, for reporting SR and meta-analyses of studies that evaluate Chinese herbal medicine or moxibustion, published in 2016 (Bian et al., 2016); PRISMA-ScR, for reporting scoping reviews, used to map the concepts underpinning a research area and the main sources and types of evidence available, published in 2018 (Tricco et al., 2018); PRISMA-DTA, published in 2018, for reports of SR and meta-analyses of diagnostic test accuracy studies (McInnes et al., 2018); and PRISMA-P, constituted by 17 items and 26 sub-items, published in 2015, with the objective of preparing SR protocols that summarize aggregate data from studies, especially evaluations of intervention effects (Moher et al., 2015).

5. MARQ Checklist. Its objective was to develop an instrument to evaluate the methodological quality of meta-reviews and to promote transparent and consistent reporting of meta-review methodology. It consists of 20 items grouped in 7 domains (Singh et al., 2012).

6. GRAPH Recommendations. Its aim was to guide the design and reporting of heart rate variability studies in psychiatry and to expand the ability to perform meta-analyses and meta-research in this area. It consists of 13 items distributed in 4 domains (Stevens et al., 2016).

7. ROBIS tool. For assessing the risk of bias in SR. It was aimed at 4 broad categories of reviews, mainly within health care settings: interventions, diagnosis, prognosis, and etiology. It is composed of 5 domains and 24 items presented as questions (Whiting et al., 2016).

CTs, RCTs and quasi-experimental studies. A total of 12 checklists or statements, 17 extensions, 2 updates, and 1 protocol were obtained (n=32).

1. CONSORT Statements. Published in 1996 (Begg et al.) and updated in 2010 (No authors listed, 2010). Its objective was to improve the quality of the clinical trial report. Composed of 22 items grouped into 5 domains (title/summary, introduction, methods, results and discussion). It includes a series of extensions and supplements, among which: STRICTA, published in 2001, whose aim was to create a checklist for reporting RCT in acupuncture, with 6 items, applicable together with the CONSORT statement (MacPherson et al., 2001); RedHot, whose objective was to create an instrument for reporting homeopathic treatments (Dean et al., 2007); the NPT List, published in 2005 (Boutron et al., 2005) and updated in 2017, whose objective was to evaluate the quality of nonpharmacological treatment CTs; it consists of 10 items and 5 sub-elements, which are evaluated as Yes, No, or Not clear (Boutron et al., 2017); CONSORT-PRO, whose objective was to improve the reporting of patient-reported outcomes (PRO), which are usually inadequately reported, thus limiting the value of the data (Calvert et al., 2013); CONSORT-SPI, published in 2013 (Montgomery et al., 2013) and updated in 2018, for reporting randomized clinical trials (RCTs) of social and psychological interventions, which extends 9 of the 25 items of CONSORT 2010, adds a new item related to stakeholder involvement, and modifies aspects of the flow diagram (Montgomery et al., 2018); IMPRINT, which seeks to improve CT information on infertility treatments (Harbin Consensus Conference Workshop Group et al., 2014); the TIDieR checklist, for the report of interventions in evaluative studies, including CT (Hoffmann et al., 2014); the adaptation to CT in orthodontics (Pandis et al., 2015); the "n-of-1" extension, to evaluate the effectiveness of an intervention in a single patient (Vohra et al., 2016); PAFS, for the report of randomized pilot and feasibility trials, which added 11 items grouped in 7 domains (Eldridge et al., 2016); KCONSORT (2009), renamed the STORK standards (2016), to generate a standard for reporting results in intervention studies using Kampo products (Motoo et al., 2017); a protocol for a scoping review to support development of a CONSORT extension for RCTs using cohorts and routinely collected health data, published in 2018 (Kwakkenbos et al., 2018); SW-CRT, published in 2018, for reporting of stepped wedge cluster RCTs, which consists of 40 items grouped in 26 domains (Hemming et al., 2018); ADs, published in 2018, an extension for adaptive design RCTs, adjusting 24 items in 16 domains of CONSORT 2010 (Dimairo et al., 2018); MAPGRT, for reporting of multi-arm parallel-group RCTs, expanding on 10 items of CONSORT 2010 (Juszczak et al., 2019); and PRT, for reporting within-person RCTs, which extends 16 items of the CONSORT 2010 checklist and introduces a modified flowchart and baseline table (Pandis et al., 2019). None of them considers score allocation.

2. TREND Statement. Its objective was to generate a tool for CT analysis when it was not possible to perform random assignment. This was composed of 21 items, grouped into 5 domains (Des Jarlais et al., 2004).

3. GNOSIS Guide. Its objective was to standardize the neuro-oncology CT report of phase 1 and 2. It consists of 7 domains and 18 items (Chang et al., 2005).

4. ISPOR RCT Report. Published in 2005 (Ramsey et al., 2005) and updated in 2015. Its objective was to serve as an orientation guide for the design, implementation and presentation of cost-effectiveness analysis reports in the CT. It has 5 domains (design, information elements, database, analysis and report of results), which group 26 items. It does not contain a numerical rating scale (Ramsey et al., 2015).

5. Newcastle-Ottawa Scale (NOS). Its objective was to assess the quality of non-randomized trials in meta-analyses. Its evaluation is currently in progress (Stang, 2010).

6. REFLECT Statement. Its objective was to improve the CT report related to "livestock and food safety". Composed of 5 domains and 22 items that include a flow diagram of the participants (Sargeant et al., 2010).

7. Ottawa Declaration. Its objective was to provide guidelines for the ethical design and conduct of cluster randomized CTs. It is composed of 7 domains (design, review by ethics committee, participants, informed consent, gatekeepers, risk-benefit assessment, and protection of participant vulnerability) (Weijer et al., 2012).

8. SPIRIT Statement. Its objective was to improve the quality of CT protocols. It consists of 33 items grouped into 5 domains (administrative information, introduction, methods, ethics and dissemination, and appendices) (Chan et al., 2013). It has one developed extension: SPIRIT-C, for trials in child health, with 11 domains (Clyburne-Sherin et al., 2015).

9. SPAC Therapy Checklist. Its objective was to develop a checklist for trials with alternative therapeutic interventions. It consists of 19 items that are answered on a Likert scale with scores from 1 (disagreement) to 9 (agreement) (Kamioka et al., 2013).

10. StaRI Statement and Checklist. Its aim was to create a statement for reporting implementation studies. It consists of 27 items grouped in 9 domains (Pinnock et al., 2017).

11. TRIALS Guidelines. Its objective was to generate a checklist for reporting embedded recruitment trials. It consists of 36 items grouped into 25 domains (Madurasinghe et al., 2016).

12. ROBINS-I Tool. It is the preferred tool to be used in Cochrane Reviews for non-randomized studies of interventions, currently available for cohort designs, with adaptations underway for other study types such as case-control and interrupted time series. ROBINS-I overlaps with RoB 2, the 'Risk of bias' 2.0 tool, but includes 3 additional domains: confounding, selection of participants into the study, and classification of interventions (a solid background in clinical epidemiology is needed to use it) (Sterne et al., 2016).

Observational studies. A total of 5 checklists or statements and 6 extensions were retrieved (n=11).

1. STROBE Statement. Its objective was to develop a checklist for reporting research results from cohort, case-control and cross-sectional studies. It consists of 6 domains (title/summary, introduction, methodology, results, discussion and others) and 22 items (von Elm et al., 2007). Different versions are provided according to the design. It has an extension called STREGA, published in 2009 (Little et al., 2009), whose objective was to provide items specific to genetic association studies (genotyping, haplotype modeling, rationale for the selection of genes, etc.). Other extensions are: STROBE-nut, published in 2016, as a list of recommendations for reporting nutritional epidemiology and dietary assessment research (24 recommendations for nutritional epidemiology, grouped in 6 domains, were added to the STROBE checklist) (Lachat et al., 2016); the INSPIRE Guideline, published in 2016 (Cheng et al., 2016), an extension of the STROBE statements and the CONSORT standards, consisting of writing guidelines to improve the quality of reporting for simulation-based research; the STROME-ID statement, published in 2014 (Field et al., 2014), to support scientific reporting of molecular epidemiological studies and to encourage authors to consider specific threats to valid inference (20 items were added to the 22 items of the STROBE checklist); the STROBE-Vet statement, published in 2016 (Sargeant et al., 2016), with reporting requirements for observational studies in veterinary medicine related to health, production, welfare, and food safety (modifications or additions were made to 16 items of the STROBE statement; in the remaining 6 items no modifications were applied); and RECORD, to help researchers who use routinely collected health data (for research in clinical epidemiology) to comply with the ethical obligation of complete and accurate reporting; it consists of 13 items that complement or modify the items of STROBE (Nicholls et al., 2016).

2. ORION Statement. Its objective was to raise the level of research and publication in hospital epidemiology related to nosocomial infections. Composed of 22 items, grouped into 5 domains (title/summary, introduction, methods, results and discussion), and a summary table (Stone et al., 2007).

3. STNS Score. Its objective was to generate a proposal to evaluate the quality of reports of surgical interventions in the treatment of trigeminal neuralgia. It was partially based on STROBE. It consists of 30 items grouped into 3 domains, and assigns points to its items (0 to 30 points) (Akram et al., 2013).

4. MInCir-ODS Initiative. Published in 2013 and updated in 2017 (Manterola & Otzen, 2017; Manterola et al., 2018). Its objective was to build a checklist for the report of results with observational descriptive studies. Composed of 19 items, grouped into 4 domains: Introduction, methodology, results and discussion.

5. GATHER Statement. Created with the objective of defining and promoting good practice in the reporting of global health estimates (for decision makers and researchers). It comprises 18 items grouped in 6 domains (Stevens et al., 2016).

Diagnostic accuracy studies. Six checklists or proposals, 1 extension and 3 updates were retrieved (n=10).

1. STARD Guidelines. Published in 2003 (Bossuyt et al., 2003) and updated in 2015 (Bossuyt et al., 2015). Its objective was to generate a standard for the report of studies of diagnostic accuracy. Composed of 30 items grouped in 6 domains (title/summary, introduction, methods, results and discussion), a flow diagram and score assignment. In 2015, an extension named ARDENT checklist was created to establish tools for standardized design and reporting of diagnostic accuracy studies of liver fibrosis tests. It consists of 27 items grouped in 5 domains (Boursier et al., 2015).

2. QUADAS Tool. Published in 2003 (Whiting et al., 2003), updated in 2011 as QUADAS-2 (Whiting et al., 2011). Its objective was to generate a tool for the quality assessment of diagnostic accuracy studies included in SRs. QUADAS-2 is based on the original QUADAS and on evidence about sources of bias and variation in diagnostic accuracy studies. It is applied in 4 phases: summary of the question, adaptation to the study being analyzed, flow chart for the primary studies, and assessment of the risk of bias and applicability.

3. QAREL Tool. Published in 2010 (Lucas et al., 2010), updated in 2013 (Lucas et al., 2013). Its objective was to develop a reliability assessment tool for diagnostic test studies, which could also be used in SRs of diagnostic tests. Composed of 7 domains (spectrum of subjects, examiners, masking of the examiner, interval between measurements, application and interpretation of the test, order of the examination, and analysis of the data) and 11 items. It is applied on the basis of questions with 3 answer alternatives: "yes" (good quality), "no" (poor quality) and "not clear"; for some articles the option "not applicable" is included.

4. GRRAS Guidelines. Its objective was to develop a tool covering the reporting of reliability and agreement in measurements, especially in healthcare. Composed of 15 items grouped in 6 domains (Kottner et al., 2011).

5. TRIPOD Statement. Its objective was to improve reporting transparency of a prediction model study for individual prognosis or diagnosis, regardless of the study methods used. It consists of 6 dimensions and 22 items (Collins et al., 2015).

6. APOSTEL Recommendations. Its objective was to develop consensus recommendations for the presentation of results of quantitative optical coherence tomography studies. It consists of 9 items (Cruz-Herranz et al., 2016).

Clinical practice guidelines. Two checklists and 1 update were retrieved (n=3).

1. AGREE Instrument. Published in 2003 (AGREE Collaboration, 2003), and updated in 2010 as AGREE II (Brouwers et al., 2010). Its objective was to advance the development, reporting and evaluation of guidelines in health care through the generation of clinical practice guidelines. It consists of 23 items grouped into 6 domains (scope and objective, participation of stakeholders, rigor of preparation, clarity of presentation, applicability, and editorial independence).

2. RIGHT Statement. Its objective was to generate an instrument for reporting practice guidelines in health care. It consists of 28 items grouped in 5 domains (Chen et al., 2017).

Biological material, animal and preclinical studies. Nine guidelines and proposals, and 1 update were retrieved (n=10).

1. MIAME Guidelines. Its objective was to establish a standard for recording and reporting microarray-based gene expression data, thus facilitating the establishment of databases and allowing the development of data analysis tools. Composed of 6 domains (experimental design, matrix, samples, hybridization, measurement, and normalization of controls) (Brazma et al., 2001).

2. REMARK Guideline. Its objective was to generate recommendations for the publication of studies on tumor markers for prognostic models. Composed of 20 items grouped in 4 domains. It contemplates a score when the instrument is applied; the maximum is 20 points (McShane et al., 2005).

3. SQUIRE Guidelines. Published in 2008 and updated as SQUIRE 2.0 in 2016. Its objective was to improve the reporting of biomedical scientific information. Composed of 19 items, grouped into 6 domains (title/summary, introduction, method, results, discussion and others) (Ogrinc et al., 2016).

4. REHBaR Proposal. Its objective was to develop a list of criteria to improve the quality of reporting results in homeopathy basic research. Composed of 23 items, grouped into 4 domains (Stock-Schröer et al., 2009).

5. ARRIVE Guidelines. Its objective was to maximize the published information and minimize unnecessary studies in animals. Composed of 20 items grouped into 5 domains (Kilkenny et al., 2010).

6. GRIPS Statement. Its objective was to improve the quality of the report of genetic risk prediction studies. Composed of 25 items, grouped into 6 domains. For each item, the specific type of information is described, as well as the minimum content that must be reported (Janssens et al., 2011).

7. CARE Guidelines. Its objective was to implement a guide for the reporting of case reports. It consists of 13 items (Gagnier et al., 2013).

8. AQUA Checklist. Developed for reporting original anatomical studies. It consists of 29 items divided into 8 domains (Tomaszewski et al., 2017).

9. PREPARE Guidelines. Its objective was to reinforce the planning stage of animal experiments. It consists of three domains: formulation; dialogue between scientists and animal facilities; and quality control of the study components (Smith et al., 2018).

Qualitative studies. Four checklists or statements and one update were retrieved (n=5).

1. COREQ Checklist. Its objective was to prepare a checklist for the report of the results of qualitative studies (interviews and focus groups). Composed of 3 domains (research and reflexivity team, design, and analysis of data and reports) and 32 items (Tong et al., 2007).

2. ENTREQ Statement. Its objective was to help researchers inform the stages associated with the synthesis of qualitative health research: search and selection of qualitative research, quality assessment and methods to synthesize qualitative findings. It consists of 5 domains and 21 items (Tong et al., 2012).

3. GREET Statement. Published in 2013 (Phillips et al., 2013), updated in 2016 (Phillips et al., 2016). Its objective was to provide guidance for the reporting of educational interventions for evidence-based practice. It consists of 17 items grouped into 6 domains (descriptive, participants, intervention, content, evaluation and confusion), with 3 response possibilities (fully reported, partially reported, not reported).

4. SRQR Recommendations. Its objective was to improve the transparency of all aspects of qualitative research. It consists of 5 dimensions and 21 items (O'Brien et al., 2014).

Economic evaluation and decision analysis studies. Three documents were recovered (n=3).

1. NHS-HTA Recommendations. Its objective was to develop recommendations to increase the generalizability of economic evaluations. It consists of: recommendations for reporting results of economic evaluations of CTs (composed of 8 items); a checklist for evaluating the generalizability of CT-based studies (composed of 10 items); and another for evaluating the generalizability of modeling studies (composed of 7 items) (Drummond et al., 2005).

2. NICE-STA Report. Its objective was to provide a checklist to evaluate the quality of economic health reports, especially STA decision analysis models, incorporating elements for economic evaluation. Composed of 46 items, grouped into 7 domains (relevance to current technology, structure, clinical evidence, data utility, use of resources and cost data, uncertainty assessment and consistency); with 4 response options (yes, no, it does not appear and not clear) and comments (Zimovetz & Wolowacz, 2009).

3. CHEERS Statement. Its objective was to develop recommendations to facilitate the reporting of economic evaluation publications. It consists of 24 items grouped into 6 domains (title / summary, introduction, methods, results, discussion and others) (Husereau et al., 2013).

Finally, it can be stated that nearly 64 guidelines, proposals and checklists are in the development process or in the protocol phase (15 for CTs and CONSORT extensions, 12 for observational studies and STROBE extensions, 10 for SRs and PRISMA extensions, 2 for CT protocols and SPIRIT extensions; and 25 for other study designs and clinical areas) (Equator Network, 2020).

DISCUSSION

As a summary of the evidence, we think that there is an important number and variety of checklists available for the reporting of results in biomedical research, which can be used by authors, reviewers and editors, all aimed at improving the quality of the report of scientific articles. These could be interesting and relevant to researchers, who need to know the various options for reporting their results according to the type of study.

The publication of the documents described above (Table II) underscores the current trend toward adequate reporting of results in biomedical research, regardless of the type of design used. Whether called checklists, check lists or verification lists, these are all instruments that include criteria to evaluate certain characteristics representing the minimum quality features required of a manuscript.

As possible limitations of the study, it seems to us that, as may occur in any SR, this study could be subject to publication and reporting bias, as well as incomplete retrieval of identified research. For example, we know that there are at least 50 proposals and checklists in the development process or in the protocol phase in the EQUATOR Network alone (Equator Network, 2020), and perhaps others that we could not find in other data sources.

However, it is important to point out that checklists were not designed to assess MQ, only compliance with some parameters; the MQ construct (a concept that allows assessment of different aspects of an article, such as type of design, population, methodology, report quality, etc.) is evaluated with ad-hoc scales such as some of those previously mentioned, which could also be used as checklists.

As a conclusion, we can point out that there is an important number and variety of checklists available for the reporting of results in biomedical research, which can be used by authors, reviewers and editors, all aimed at improving the quality of the report of scientific articles.

REFERENCES

AGREE Collaboration. Development and validation of an international appraisal instrument for assessing the quality of clinical practice guidelines: the AGREE project. Qual. Saf. Health Care, 12(1):18-23, 2003. [ Links ]

Akram, H.; Mirza, B.; Kitchen, N. & Zakrzewska, J. M. Proposal for evaluating the quality of reports of surgical interventions in the treatment of trigeminal neuralgia: the Surgical Trigeminal Neuralgia Score. Neurosurg. Focus, 35(3):E3, 2013. [ Links ]

Begg, C.; Cho, M.; Eastwood, S.; Horton, R.; Moher, D.; Olkin, I.; Pitkin, R.; Rennie, D.; Schulz, K. F.; Simel, D.; et al. Improving the quality of reporting of randomized controlled trials. The CONSORT statement. JAMA, 276(8):637-9, 1996. [ Links ]

Bian, Z. X. Preferred Reporting Items for Systematic Review and Meta-Analyses of traditional Chinese medicine: the PRISMA-TCM Statement. In: Equator Network. Web Site. Oxford, The EQUATOR Network and UK EQUATOR Centre, Centre for Statistics in Medicine, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS), University of Oxford, 2016. Available from: http://www.equator-network.org/library/reporting-guidelines-under-development/#65 [ Links ]

Bossuyt, P. M.; Reitsma, J. B.; Bruns, D. E.; Gatsonis, C. A.; Glasziou, P. P.; Irwig, L.; Lijmer, J. G.; Moher, D.; Rennie, D.; de Vet, H. C.; et al. STARD 2015: an updated list of essential items for reporting diagnostic accuracy studies. Radiology, 277(3):826-32, 2015. [ Links ]

Bossuyt, P. M.; Reitsma, J. B.; Bruns, D. E.; Gatsonis, C. A.; Glasziou, P. P.; Irwig, L. M.; Lijmer, J. G.; Moher, D.; Rennie, D.; de Vet, H. C.; et al. Towards complete and accurate reporting of studies of diagnostic accuracy: the STARD initiative. Standards for Reporting of Diagnostic Accuracy. Clin. Chem., 49(1):1-6, 2003. [ Links ]

Boursier, J.; de Ledinghen, V.; Poynard, T.; Guéchot, J.; Carrat, F.; Leroy, V.; Wong, G. L.; Friedrich-Rust, M.; Fraquelli, M.; Plebani, M.; et al. An extension of STARD statements for reporting diagnostic accuracy studies on liver fibrosis tests: the Liver-FibroSTARD standards. J. Hepatol., 62(4):807-15, 2015. [ Links ]

Boutron, I.; Altman, D. G.; Moher, D.; Schulz, K. F.; Ravaud, P. & CONSORT NPT Group. CONSORT Statement for Randomized Trials of Nonpharmacologic Treatments: A 2017 Update and a CONSORT Extension for Nonpharmacologic Trial Abstracts. Ann. Intern. Med., 167(1):40-7, 2017. [ Links ]

Boutron, I.; Moher, D.; Tugwell, P.; Giraudeau, B.; Poiraudeau, S.; Nizard, R. & Ravaud, P. A checklist to evaluate a report of a nonpharmacological trial (CLEAR NPT) was developed using consensus. J. Clin. Epidemiol., 58(12):1233-40, 2005. [ Links ]

Brazma, A.; Hingamp, P.; Quackenbush, J.; Sherlock, G.; Spellman, P.; Stoeckert, C.; Aach, J.; Ansorge, W.; Ball, C. A.; Causton, H. C.; et al. Minimum information about a microarray experiment (MIAME) toward standards for microarray data. Nat. Genet., 29(4):365-71, 2001. [ Links ]

Brouwers, M. C.; Kho, M. E.; Browman, G. P.; Burgers, J. S.; Cluzeau, F.; Feder, G.; Fervers, B.; Graham, I. D.; Grimshaw, J.; Hanna, S. E.; et al. AGREE II: advancing guideline development, reporting and evaluation in health care. C. M. A. J., 182(18):E839-42, 2010. [ Links ]

Calvert, M.; Blazeby, J.; Altman, D. G.; Revicki, D. A.; Moher, D.; Brundage, M. D. & CONSORT PRO Group. Reporting of patient-reported outcomes in randomized trials: the CONSORT PRO extension. JAMA, 309(8):814-22, 2013. [ Links ]

Campbell, M. K.; Elbourne, D. R.; Altman, D. G. & CONSORT group. CONSORT statement: extension to cluster randomised trials. BMJ, 328(7441):702-8, 2004. [ Links ]

Chan, A. W.; Tetzlaff, J. M.; Altman, D. G.; Laupacis, A.; Gøtzsche, P. C.; Krleza-Jeric, K.; Hróbjartsson, A.; Mann, H.; Dickersin, K.; Berlin, J. A.; et al. SPIRIT 2013 statement: defining standard protocol items for clinical trials. Ann. Intern. Med., 158(3):200-7, 2013. [ Links ]

Chang, S. M.; Reynolds, S. L.; Butowski, N.; Lamborn, K. R.; Buckner, J. C.; Kaplan, R. S. & Bigner, D. D. GNOSIS: guidelines for neuro-oncology: standards for investigational studies-reporting of phase 1 and phase 2 clinical trials. Neuro. Oncol., 7(4):425-34, 2005. [ Links ]

Chen, Y.; Yang, K.; Marusic, A.; Qaseem, A.; Meerpohl, J. J.; Flottorp, S.; Akl, E. A.; Schünemann, H. J.; Chan, E. S.; Falck-Ytter, Y.; et al. A Reporting Tool for Practice Guidelines in Health Care: The RIGHT Statement. Ann. Intern. Med., 166(2):128-32, 2017. [ Links ]

Cheng, A.; Kessler, D.; Mackinnon, R.; Chang, T. P.; Nadkarni, V. M.; Hunt, E. A.; Duval-Arnould, J.; Lin, Y.; Cook, D. A.; Pusic, M.; et al. Reporting Guidelines for Health Care Simulation Research: Extensions to the CONSORT and STROBE Statements. Simul. Healthc., 11(4):238-48, 2016. [ Links ]

Clyburne-Sherin, A. V.; Thurairajah, P.; Kapadia, M. Z.; Sampson, M.; Chan, W. W. & Offringa, M. Recommendations and evidence for reporting items in pediatric clinical trial protocols and reports: two systematic reviews. Trials, 16:417, 2015. [ Links ]

Collins, G. S.; Reitsma, J. B.; Altman, D. G. & Moons, K. G. Transparent reporting of a multivariable prediction model for individual prognosis or diagnosis (TRIPOD): the TRIPOD statement. BMJ, 350:g7594, 2015. [ Links ]

Cruz-Herranz, A.; Balk, L. J.; Oberwahrenbrock, T.; Saidha, S.; MartinezLapiscina, E. H.; Lagreze, W. A.; Schuman, J. S.; Villoslada, P.; Calabresi, P.; Balcer, L.; et al. The APOSTEL recommendations for reporting quantitative optical coherence tomography studies. Neurology, 86(24):2303-9, 2016. [ Links ]

Dean, M. E.; Coulter, M. K.; Fisher, P.; Jobst, K. A. & Walach, H. Reporting data on homeopathic treatments (RedHot): a supplement to CONSORT. J. Altern. Complement. Med., 13(1):19-23, 2007. [ Links ]

Des Jarlais, D. C.; Lyles, C.; Crepaz, N. & TREND Group. Improving the reporting quality of nonrandomized evaluations of behavioral and public health interventions: the TREND statement. Am. J. Public Health, 94(3):361-6, 2004. [ Links ]

Dimairo, M.; Coates, E.; Pallmann, P.; Todd, S.; Julious, S. A.; Jaki, T.; Wason, J.; Mander, A. P.; Weir, C. J.; Koenig, F.; et al. Development process of a consensus-driven CONSORT extension for randomised trials using an adaptive design. BMC Med., 16(1):210, 2018. [ Links ]

Drummond, M.; Manca, A. & Sculpher, M. Increasing the generalizability of economic evaluations: recommendations for the design, analysis, and reporting of studies. Int. J. Technol. Assess. Health Care, 21(2):165-71, 2005. [ Links ]

Eldridge, S. M.; Chan, C. L.; Campbell, M. J.; Bond, C. M.; Hopewell, S.; Thabane, L.; Lancaster, G. A. & PAFS consensus group. CONSORT 2010 statement: extension to randomised pilot and feasibility trials. BMJ, 355:i5239, 2016. [ Links ]

Equator Network. Enhancing the QUAlity and Transparency of Health Research, 2020. Available from: http://www.equator-network.org/library/reporting-guidelines-under-development [ Links ]

Field, N.; Cohen, T.; Struelens, M. J.; Palm, D.; Cookson, B.; Glynn, J. R.; Gallo, V.; Ramsay, M.; Sonnenberg, P.; Maccannell, D.; et al. Strengthening the Reporting of Molecular Epidemiology for Infectious Diseases (STROME-ID): an extension of the STROBE statement. Lancet Infect. Dis., 14(4):341-52, 2014. [ Links ]

Gagnier, J. J.; Kienle, G.; Altman, D. G.; Moher, D.; Sox, H.; Riley, D. & CARE Group. The CARE Guidelines: Consensus-based Clinical Case Reporting Guideline Development. Glob. Adv. Health Med., 2(5):38-45, 2013. [ Links ]

Harbin Consensus Conference Workshop Group; Legro, R. S.; Wu, X.; Barnhart, K. T.; Farquhar, C.; Fauser, B. C. & Mol, B. Improving the reporting of clinical trials of infertility treatments (IMPRINT): modifying the CONSORT statement. Hum. Reprod., 29(10):2075-82, 2014. [ Links ]

Hemming, K.; Taljaard, M.; McKenzie, J. E.; Hooper, R.; Copas, A.; Thompson, J. A.; Dixon-Woods, M.; Aldcroft, A.; Doussau, A.; Grayling, M.; et al. Reporting of stepped wedge cluster randomised trials: extension of the CONSORT 2010 statement with explanation and elaboration. BMJ, 363:k1614, 2018. [ Links ]

Hoffmann, T. C.; Glasziou, P. P.; Boutron, I.; Milne, R.; Perera, R.; Moher, D.; Altman, D. G.; Barbour, V.; Macdonald, H.; Johnston, M.; et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ, 348:g1687, 2014. [ Links ]

Husereau, D.; Drummond, M.; Petrou, S.; Carswell, C.; Moher, D.; Greenberg, D.; Augustovski, F.; Briggs, A. H.; Mauskopf, J.; Loder, E.; et al. Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement. Value Health, 16(2):e1-5, 2013. [ Links ]

Hutton, B.; Salanti, G.; Caldwell, D. M.; Chaimani, A.; Schmid, C. H.; Cameron, C.; Ioannidis, J. P.; Straus, S.; Thorlund, K.; Jansen, J. P.; et al. The PRISMA extension statement for reporting of systematic reviews incorporating network meta-analyses of health care interventions: checklist and explanations. Ann. Intern. Med., 162(1):777-84, 2015. [ Links ]

Janssens, A. C.; Ioannidis, J. P.; van Duijn, C. M.; Little, J.; Khoury, M. J. & GRIPS Group. Strengthening the reporting of genetic risk prediction studies: the GRIPS statement. Eur. J. Clin. Invest., 41(9):1004-9, 2011. [ Links ]

Juszczak, E.; Altman, D. G.; Hopewell, S. & Schulz, K. Reporting of Multi-Arm Parallel-Group Randomized Trials: Extension of the CONSORT 2010 Statement. JAMA, 321(16):1610-20, 2019. [ Links ]

Kamioka, H.; Kawamura, Y.; Tsutani, K.; Maeda, M.; Hayasaka, S.; Okuizum, H.; Okada, S.; Honda, T. & Iijima, Y. A checklist to assess the quality of reports on spa therapy and balneotherapy trials was developed using the Delphi consensus method: the SPAC checklist. Complement. Ther. Med., 21(4):324-32, 2013. [ Links ]

Kapadia, M. Z.; Askie, L.; Hartling, L.; Contopoulos-Ioannidis, D.; Bhutta, Z. A.; Soll, R.; Moher, D. & Offringa, M. PRISMA-Children (C) and PRISMA-Protocol for Children (P-C) Extensions: a study protocol for the development of guidelines for the conduct and reporting of systematic reviews and meta-analyses of newborn and child health research. BMJ Open, 6:e010270, 2016. [ Links ]

Kilkenny, C.; Browne, W. J.; Cuthill, I. C.; Emerson, M. & Altman, D. G. Improving bioscience research reporting: the ARRIVE guidelines for reporting animal research. PLoS Biol., 8(6):e1000412, 2010. [ Links ]

Kottner, J.; Audigé, L.; Brorson, S.; Donner, A.; Gajewski, B. J.; Hróbjartsson, A.; Roberts, C.; Shoukri, M. & Streiner, D. L. Guidelines for Reporting Reliability and Agreement Studies (GRRAS) were proposed. J. Clin. Epidemiol., 64(1):96-106, 2011. [ Links ]

Kwakkenbos, L.; Juszczak, E.; Hemkens, L. G.; Sampson, M.; Fröbert, O.; Relton, C.; Gale, C.; Zwarenstein, M.; Langan, S. M.; Moher, D.; et al. Protocol for the development of a CONSORT extension for RCTs using cohorts and routinely collected health data. Res. Integr. Peer Rev., 3:9, 2018. [ Links ]

Lachat, C.; Hawwash, D.; Ocké, M. C.; Berg, C.; Forsum, E.; Hörnell, A.; Larsson, C.; Sonestedt, E.; Wirfält, E.; Åkesson, A.; et al. Strengthening the Reporting of Observational Studies in Epidemiology-Nutritional Epidemiology (STROBE-nut): An Extension of the STROBE Statement. PLoS Med., 13(6):e1002036, 2016. [ Links ]

Little, J.; Higgins, J. P.; Ioannidis, J. P.; Moher, D.; Gagnon, F.; von Elm, E.; Khoury, M. J.; Cohen, B.; Davey-Smith, G.; Grimshaw, J.; et al. Strengthening the Reporting of Genetic Association Studies (STREGA)-An extension of the STROBE statement. Genet. Epidemiol., 33(7):581-98, 2009. [ Links ]

Lucas, N. P.; Macaskill, P.; Irwig, L. & Bogduk, N. The development of a quality appraisal tool for studies of diagnostic reliability (QAREL). J. Clin. Epidemiol., 63(8):854-61, 2010. [ Links ]

Lucas, N.; Macaskill, P.; Irwig, L.; Moran, R.; Rickards, L.; Turner, R. & Bogduk, N. The reliability of a quality appraisal tool for studies of diagnostic reliability (QAREL). BMC Med. Res. Methodol., 13:111, 2013. [ Links ]

MacPherson, H.; White, A.; Cummings, M.; Jobst, K.; Rose, K. & Niemtzow, R. Standards for reporting interventions in controlled trials of acupuncture: the STRICTA recommendations. Complement. Ther. Med., 9(4):246-9, 2001. [ Links ]

Madurasinghe, V. W. & Sandra Eldridge on behalf of MRC START Group and Gordon Forbes on behalf of the START Expert Consensus Group. Guidelines for reporting embedded recruitment trials. Trials, 17:27, 2016. [ Links ]

Manterola, C. & Otzen, T. Checklist for reporting results using observational descriptive studies as research designs. The MInCir initiative. Int. J. Morphol., 35(1):72-6, 2017. [ Links ]

Manterola, C.; Cartes-Velásquez, R.; Burgos, M. E.; Sanhueza, A. & Otzen, T. Development and initial validation of a scale to measure methodological quality in diagnostic accuracy studies. The MInCir proposal. Int. J. Morphol., 36(2):743-9, 2018. [ Links ]

McGowan, J.; Sampson, M.; Salzwedel, D. M.; Cogo, E.; Foerster, V. & Lefebvre, C. PRESS Peer Review of Electronic Search Strategies: 2015 Guideline Statement. J. Clin. Epidemiol., 75:40-6, 2016. [ Links ]

McInnes, M. D. F.; Moher, D.; Thombs, B. D.; McGrath, T. A.; Bossuyt, P. M.; PRISMA-DTA Group; Clifford, T.; Cohen, J. F.; Deeks, J. J.; Gatsonis, C.; et al. Preferred reporting items for a systematic review and meta-analysis of diagnostic test accuracy studies: The PRISMA-DTA Statement. JAMA, 319(4):388-96, 2018. [ Links ]

McShane, L. M.; Altman, D. G.; Sauerbrei, W.; Taube, S. E.; Gion, M.; Clark, G. M. & Statistics Subcommittee of the NCI-EORTC Working Group on Cancer Diagnostics. Reporting recommendations for tumor marker prognostic studies (REMARK). Br. J. Cancer, 97(16):1180-4, 2005. [ Links ]

Moher, D.; Cook, D. J.; Eastwood, S.; Olkin, I.; Rennie, D. & Stroup, D. F. Improving the quality of reports of meta-analyses of randomised controlled trials: the QUOROM statement. Quality of Reporting of Meta-analyses. Lancet, 354(9193):1896-900, 1999. [ Links ]

Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D. G. & PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med., 6(7):e1000097, 2009. [ Links ]

Moher, D.; Schulz, K. F. & Altman, D. G. The CONSORT statement: revised recommendations for improving the quality of reports of parallel-group randomised trials. Lancet, 357(9263):1191-4, 2001. [ Links ]

Moher, D.; Shamseer, L.; Clarke, M.; Ghersi, D.; Liberati, A.; Petticrew, M.; Shekelle, P.; Stewart, L. A. & PRISMA-P Group. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Syst. Rev., 4:1, 2015. [ Links ]

Montgomery, P.; Grant, S.; Hopewell, S.; Macdonald, G.; Moher, D.; Michie, S. & Mayo-Wilson, E. Protocol for CONSORT-SPI: an extension for social and psychological interventions. Implement. Sci., 8:99, 2013. [ Links ]

Montgomery, P.; Grant, S.; Mayo-Wilson, E.; Macdonald, G.; Michie, S.; Hopewell, S.; Moher, D. & CONSORT-SPI Group. Reporting randomised trials of social and psychological interventions: the CONSORT-SPI 2018 Extension. Trials, 19(1):407, 2018. [ Links ]

Motoo, Y.; Hakamatsuka, T.; Kawahara, N.; Arai, I. & Tsutani, K. Standards of Reporting Kampo Products (STORK) in research articles. J. Integr. Med., 15(3):182-5, 2017. [ Links ]

Nicholls, S. G.; Langan, S. M.; Sørensen, H. T.; Petersen, I. & Benchimol, E. I. The RECORD reporting guidelines: meeting the methodological and ethical demands of transparency in research using routinely collected health data. Clin. Epidemiol., 8:389-92, 2016. [ Links ]

No authors listed. CONSORT 2010. Lancet, 375(9721):1136, 2010. [ Links ]

O’Brien, B. C.; Harris, I. B.; Beckman, T. J.; Reed, D. A. & Cook, D. A. Standards for reporting qualitative research: a synthesis of recommendations. Acad. Med., 89(9):1245-51, 2014. [ Links ]

Ogrinc, G.; Davies, L.; Goodman, D.; Batalden, P.; Davidoff, F. & Stevens, D. SQUIRE 2.0 (Standards for QUality Improvement Reporting Excellence): revised publication guidelines from a detailed consensus process. BMJ Qual. Saf., 25(12):986-92, 2016. [ Links ]

Pandis, N.; Chung, B.; Scherer, R. W.; Elbourne, D. & Altman, D. G. CONSORT 2010 statement: extension checklist for reporting within person randomised trials. Br. J. Dermatol., 180(3):534-52, 2019. [ Links ]

Pandis, N.; Fleming, P. S.; Hopewell, S. & Altman, D. G. The CONSORT Statement: Application within and adaptations for orthodontic trials. Am. J. Orthod. Dentofacial Orthop., 147(6):663-79, 2015. [ Links ]

Phillips, A. C.; Lewis, L. K.; McEvoy, M. P.; Galipeau, J.; Glasziou, P.; Hammick, M.; Moher, D.; Tilson, J. & Williams, M. T. Protocol for development of the guideline for reporting evidence based practice educational interventions and teaching (GREET) statement. BMC Med. Educ., 13:9, 2013. [ Links ]

Phillips, A. C.; Lewis, L. K.; McEvoy, M. P.; Galipeau, J.; Glasziou, P.; Moher, D.; Tilson, J. K. & Williams, M. T. Development and validation of the guideline for reporting evidence-based practice educational interventions and teaching (GREET). BMC Med. Educ., 16:237, 2016. [ Links ]

Pinnock, H.; Barwick, M.; Carpenter, C. R.; Eldridge, S.; Grandes, G.; Griffiths, C. J.; Rycroft-Malone, J.; Meissner, P.; Murray, E.; Patel, A.; et al. Standards for Reporting Implementation Studies (StaRI) Statement. BMJ, 356:i6795, 2017. [ Links ]

Ramsey, S. D.; Willke, R. J.; Glick, H.; Reed, S. D.; Augustovski, F.; Jonsson, B.; Briggs, A. & Sullivan, S. D. Cost-effectiveness analysis alongside clinical trials II-An ISPOR Good Research Practices Task Force report. Value Health, 18(2):161-72, 2015. [ Links ]

Ramsey, S.; Willke, R.; Briggs, A.; Brown, R.; Buxton, M.; Chawla, A.; Cook, J.; Glick, H.; Liljas, B.; Petitti, D.; et al. Good research practices for cost-effectiveness analysis alongside clinical trials: the ISPOR RCT-CEA Task Force report. Value Health, 8(5):521-33, 2005. [ Links ]

Rethlefsen, M.; Koffel, J. & Kirtley, S. PRISMA-Search: guidelines for reporting systematic review literature searches. In: Equator Network. Web Site. Oxford, The EQUATOR Network and UK EQUATOR Centre, Centre for Statistics in Medicine, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS), University of Oxford, 2016. Available from: http://www.equator-network.org/library/reporting-guidelines-under-development/#57 [ Links ]

Sargeant, J. M.; O'Connor, A. M.; Dohoo, I. R.; Erb, H. N.; Cevallos, M.; Egger, M.; Ersbøll, A. K.; Martin, S. W.; Nielsen, L. R.; Pearl, D. L.; et al. Methods and processes of developing the strengthening the reporting of observational studies in epidemiology - veterinary (STROBE-Vet) statement. Prev. Vet. Med., 134:188-96, 2016. [ Links ]

Sargeant, J. M.; O’Connor, A. M.; Gardner, I. A.; Dickson, J. S.; Torrence, M. E. & Consensus Meeting Participants. The REFLECT statement: reporting guidelines for Randomized Controlled Trials in livestock and food safety: explanation and elaboration. Zoonoses Public Health, 57(2):105-36, 2010. [ Links ]

Shea, B. J.; Grimshaw, J. M.; Wells, G. A.; Boers, M.; Andersson, N.; Hamel, C.; Porter, A. C.; Tugwell, P.; Moher, D. & Bouter, L. M. Development of AMSTAR: a measurement tool to assess the methodological quality of systematic reviews. BMC Med. Res. Methodol., 7:10, 2007. [ Links ]

Shea, B. J.; Reeves, B. C.; Wells, G.; Thuku, M.; Hamel, C.; Moran, J.; Moher, D.; Tugwell, P.; Welch, V.; Kristjansson, E.; et al. AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. BMJ, 358:j4008, 2017. [ Links ]

Singh, J. P. Development of the Metareview Assessment of Reporting Quality (MARQ) Checklist. Rev. Fac. Med., 60:285-92, 2012. [ Links ]

Smith, A. J.; Clutton, R. E.; Lilley, E.; Hansen, K. E. A. & Brattelid, T. PREPARE: guidelines for planning animal research and testing. Lab. Anim., 52(2):135-41, 2018. [ Links ]

Stang, A. Critical evaluation of the Newcastle-Ottawa scale for the assessment of the quality of nonrandomized studies in meta-analyses. Eur. J. Epidemiol., 25(9):603-5, 2010. [ Links ]

Sterne, J. A.; Hernán, M. A.; Reeves, B. C.; Savović, J.; Berkman, N. D.; Viswanathan, M.; Henry, D.; Altman, D. G.; Ansari, M. T.; Boutron, I.; et al. ROBINS-I: a tool for assessing risk of bias in non-randomised studies of interventions. BMJ, 355:i4919, 2016. [ Links ]

Stevens, A. PRISMA-RR: an extension to PRISMA for rapid reviews. In: Equator Network. Web Site. Oxford, The EQUATOR Network and UK EQUATOR Centre, Centre for Statistics in Medicine, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS), University of Oxford, 2015. Available from: http://www.equator-network.org/library/reporting-guidelines-under-development/#51 [ Links ]

Stevens, G. A.; Alkema, L.; Black, R. E.; Boerma, J. T.; Collins, G. S.; Ezzati, M.; Grove, J. T.; Hogan, D. R.; Hogan, M. C.; Horton, R.; et al. Guidelines for Accurate and Transparent Health Estimates Reporting: the GATHER statement. Lancet, 388(10062):e19-e23, 2016. [ Links ]

Stewart, L. A.; Clarke, M.; Rovers, M.; Riley, R. D.; Simmonds, M.; Stewart, G.; Tierney, J. F. & PRISMA-IPD Development Group. Preferred Reporting Items for Systematic Review and Meta-Analyses of individual participant data: the PRISMA-IPD Statement. JAMA, 313(16):1657-65, 2015. [ Links ]

Stock-Schröer, B.; Albrecht, H.; Betti, L.; Endler, P. C.; Linde, K.; Lüdtke, R.; Musial, F.; van Wijk, R.; Witt, C. & Baumgartner, S. Reporting experiments in homeopathic basic research (REHBaR)--a detailed guideline for authors. Homeopathy, 98(4):287-98, 2009. [ Links ]

Stone, S. P.; Cooper, B. S.; Kibbler, C. C.; Cookson, B. D.; Roberts, J. A.; Medley, G. F.; Duckworth, G.; Lai, R.; Ebrahim, S.; Brown, E. M.; et al. The ORION statement: guidelines for transparent reporting of outbreak reports and intervention studies of nosocomial infection. Lancet Infect. Dis., 7(4):282-8, 2007. [ Links ]

Stroup, D. F.; Berlin, J. A.; Morton, S. C.; Olkin, I.; Williamson, G. D.; Rennie, D.; Moher, D.; Becker, B. J.; Sipe, T. A. & Thacker, S. B. Meta-analysis of observational studies in epidemiology: a proposal for reporting. Meta-analysis Of Observational Studies in Epidemiology (MOOSE) group. JAMA, 283(15):2008-12, 2000. [ Links ]

Tomaszewski, K. A.; Henry, B. M.; Kumar Ramakrishnan, P.; Roy, J.; Vikse, J.; Loukas, M.; Tubbs, R. S. & Walocha, J. A. Development of the Anatomical Quality Assurance (AQUA) checklist: Guidelines for reporting original anatomical studies. Clin. Anat., 30(1):14-20, 2017. [ Links ]

Tong, A.; Flemming, K.; McInnes, E.; Oliver, S. & Craig, J. Enhancing transparency in reporting the synthesis of qualitative research: ENTREQ. BMC Med. Res. Methodol., 12:181, 2012. [ Links ]

Tong, A.; Sainsbury, P. & Craig, J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int. J. Qual. Health Care, 19(6):349-57, 2007. [ Links ]

Tricco, A. C.; Lillie, E.; Zarin, W.; O'Brien, K. K.; Colquhoun, H.; Levac, D.; Moher, D.; Peters, M. D. J.; Horsley, T.; Weeks, L.; et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation. Ann. Intern. Med., 169(7):467-73, 2018. [ Links ]

Vohra, S.; Shamseer, L.; Sampson, M.; Bukutu, C.; Schmid, C. H.; Tate, R.; Nikles, J.; Zucker, D. R.; Kravitz, R.; Guyatt, G.; et al. CONSORT extension for reporting N-of-1 trials (CENT) 2015 Statement. J. Clin. Epidemiol., 76:9-17, 2016. [ Links ]

von Elm, E.; Altman, D. G.; Egger, M.; Pocock, S. J.; Gøtzsche, P. C.; Vandenbroucke, J. P. & STROBE Initiative. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. Lancet, 370(9596):1453-7, 2007. [ Links ]

Weijer, C.; Grimshaw, J. M.; Eccles, M. P.; McRae, A. D.; White, A.; Brehaut, J. C.; Taljaard, M. & Ottawa Ethics of Cluster Randomized Trials Consensus Group. The Ottawa Statement on the ethical design and conduct of cluster randomized trials. PLoS Med., 9(11):e1001346, 2012. [ Links ]

Welch, V.; Petticrew, M.; Petkovic, J.; Moher, D.; Waters, E.; White, H.; Tugwell, P. & PRISMA-Equity Bellagio group. Extending the PRISMA statement to equity-focused systematic reviews (PRISMA-E 2012): explanation and elaboration. J. Clin. Epidemiol., 70:68-89, 2016. [ Links ]

Welch, V.; Petticrew, M.; Tugwell, P.; Moher, D.; O'Neill, J.; Waters, E.; White, H. & PRISMA-Equity Bellagio group. PRISMA-Equity 2012 extension: reporting guidelines for systematic reviews with a focus on health equity. PLoS Med., 9(10):e1001333, 2012. [ Links ]

Whiting, P. F.; Rutjes, A. W.; Westwood, M. E.; Mallett, S.; Deeks, J. J.; Reitsma, J. B.; Leeflang, M. M.; Sterne, J. A.; Bossuyt, P. M. & QUADAS-2 Group. QUADAS-2: a revised tool for the quality assessment of diagnostic accuracy studies. Ann. Intern. Med., 155(8):529-36, 2011. [ Links ]

Whiting, P.; Rutjes, A. W.; Reitsma, J. B.; Bossuyt, P. M. & Kleijnen, J. The development of QUADAS: a tool for the quality assessment of studies of diagnostic accuracy included in systematic reviews. BMC Med. Res. Methodol., 3:25, 2003. [ Links ]

Whiting, P.; Savovic, J.; Higgins, J. P.; Caldwell, D. M.; Reeves, B. C.; Shea, B.; Davies, P.; Kleijnen, J.; Churchill, R. & ROBIS group. ROBIS: A new tool to assess risk of bias in systematic reviews was developed. J. Clin. Epidemiol., 69:225-34, 2016. [ Links ]

Zimovetz, E. & Wolowacz, S. Reviewer’s Checklist for Assessing the Quality of Decision Models. In: ISPOR 12th Annual European Congress. Abstracts. Value Health, 12(7):A221-481, 2009. [ Links ]

Received: October 03, 2019; Accepted: December 28, 2019

* Correspondence to: E-mail: carlos.manterola@ufrontera.cl

Corresponding author: Prof. Dr. Carlos Manterola, MD, PhD. Department of Surgery and CEMyQ Universidad de La Frontera, Temuco - CHILE.

This is an open-access article distributed under the terms of the Creative Commons Attribution License.