
Revista de ciencia política (Santiago)

Online version ISSN 0718-090X

Rev. cienc. polít. (Santiago) vol.41 no.3 Santiago Dec. 2021

http://dx.doi.org/10.4067/S0718-090X2021005000108 

Artículos

Documentation Requirements in Political Science Journals: Moving Towards Open Access Practices

Documentación requerida en revistas de Ciencia Política: hacia prácticas de acceso abierto

Carolina Curvale1 

Carolina Curvale is a research professor in political science at Facultad Latinoamericana de Ciencias Sociales, Ecuador. She obtained her Ph.D. and master's degree in political science at New York University and her BA in political science at Universidad de San Andrés, Argentina. Her main areas of research are comparative politics, political economy, and Latin American politics. Her research has been published by Oxford University Press and the United Nations Economic Commission for Latin America and the Caribbean, among others. E-mail: ccurvale@flacso.edu.ec


http://orcid.org/0000-0002-3493-9855

Gustavo Pérez-Arrobo2 

Gustavo Pérez-Arrobo is a research associate in the Political Studies Department at Facultad Latinoamericana de Ciencias Sociales, Ecuador. He holds a master's degree in comparative politics from Facultad Latinoamericana de Ciencias Sociales, Ecuador, and a BSc in economics from Pontificia Universidad Católica del Ecuador. He will join the Ph.D. program in Political Science at the University of Colorado Boulder beginning fall 2021. His main areas of research are the political economy of development, research methodology in political science, and Latin American politics. E-mail: gaperezfl@flacso.edu.ec


http://orcid.org/0000-0003-0162-1821

1Facultad Latinoamericana de Ciencias Sociales (FLACSO), Ecuador

2Facultad Latinoamericana de Ciencias Sociales (FLACSO), Ecuador

ABSTRACT

This article explores the incidence of open science in academic research in the field of Political Science. The goal of open science is to shed light on the data, research procedures, and results of academic work, thus making this information accessible to reviewers and the general public. This practice is not prevalent across the social sciences, although there is increasing interest in the importance of reproducibility in building or rejecting theory and knowledge. However, important epistemological and methodological debates have evolved around the feasibility and desirability of adopting these practices as a standard in the discipline. This article systematically collects and analyzes data on the publishing requirements of the top journals in the field, including pre-registration. We also provide a state of the art on the implementation of these best practices in Latin America.

Keywords: open access; open science; publishing requirements; pre-registration of hypotheses; best practices

RESUMEN

Este artículo explora la incidencia de la ciencia abierta en la investigación académica en el campo de la ciencia política. El objetivo de la ciencia abierta es hacer transparente la información sobre datos, procedimientos de investigación y resultados del trabajo académico, haciendo que la información sea accesible para los revisores y el público en general. Esta práctica no es predominante en las ciencias sociales, aunque existe un creciente interés sobre la importancia de la reproducibilidad en la construcción o rechazo de la teoría y el conocimiento. Sin embargo, importantes debates epistemológicos y metodológicos han surgido sobre la factibilidad y deseabilidad de adoptar estas prácticas como estándares en la disciplina. Este artículo recopila y analiza sistemáticamente datos sobre los requisitos de publicación de las principales revistas en los campos en cuestión, incluida la preinscripción. También ofrecemos un estado del arte en la implementación de estas prácticas en América Latina.

Palabras clave: acceso abierto; ciencia abierta; requisitos de publicación; pre-registro de hipótesis; mejores prácticas

I. INTRODUCTION

The concern with raising our disciplinary standards and adopting best practices to ensure transparency in empirical research has been an ongoing conversation that dates back to Gary King's (1995) landmark article on replication. Contrary to what has occurred in other disciplines, political science had not undergone a major crisis regarding the credibility of research; neither had there been cases of fraud or journal article retractions until 2015. The LaCour scandal changed that path and drew our science into a new discussion (Foster 2015; Young and Janz 2015; Findley et al. 2016).1 Scholars in the social sciences have produced, in the past decade, a number of articles that address the need for more transparent empirical research practices (see, for example, Bowers and Voors 2016), in particular practices that prevent p-fishing and p-hacking and that allow for replication. Progress has been made in estimating the magnitude of the problem, even when one starts from the assumption that researchers seek to produce relevant, solid work - which is also publishable. For example, Simmons, Nelson, and Simonsohn show “how unacceptably easy it is to accumulate (and report) statistically significant evidence for a false hypothesis” (2011: 1359).

The American Political Science Association (APSA) has taken a leading role in promoting the incorporation of transparency in research practices into the profession. In 2012 the APSA Council amended its Guide to Professional Ethics in Political Science to incorporate Data Access and Research Transparency (DA-RT) principles that had been prepared by an ad-hoc committee. Additionally, 27 journals signed a 2014 statement on transparency (JETS)2 committing themselves to take specific steps in order to achieve transparency by January 15, 2016 - including requiring authors to make data available. A number of symposia, panel discussions, and journal special issues on the topic continued the discussion, given that there were many concerns about how this would be implemented and how it would affect the costs and burden of doing research.3 A dividing line was the methodological approach employed, as the feasibility and costs of sharing data and metadata could vary significantly depending on the type of study. This methodological divide resulted in two separate transparency drafts depending on whether the study was qualitative or quantitative - an old divide in the discipline (Guidelines for Data Access and Research Transparency for Qualitative Research in Political Science and Guidelines for Data Access and Research Transparency for Quantitative Research in Political Science). Others proposed that the problem was heuristic in nature, the core of the issue being whether the approach employed in research was positivist or non-positivist (see, for example, Isaac 2015). These are undoubtedly important and controversial issues that have been discussed at length elsewhere and do not constitute the core of this piece.

However, two specific topics that are part of the discussions dividing the discipline are relevant for this paper. First, the meaning of transparency may not be the same in the quantitative and qualitative research traditions, as APSA's Qualitative Transparency Deliberations have made clear. While it may be feasible for quantitative researchers to share datasets and files after publication, a qualitative researcher may be bound by ethical and legal considerations when deciding whether to publicly expose transcripts of interviews conducted in complex contexts, or how to share information about the interpersonal nature of field research (see Tripp (2018) for a detailed analysis). Additionally, the debate over the potential adoption of universal research practices has led to rethinking the meaning of producing knowledge and what our role as political scientists is (Lenine and Mörschbächer 2019).

The second crucial topic has to do with the desirability and feasibility of replication. There is an important distinction between “reproducibility” and “replication.”4 While political scientists share an interest in producing quality research, sometimes replication is simply impossible: How can the experience of ethnography be replicated? Would doing so even make sense? Perhaps it would be important to consider an approach that focuses on the rigor of the process of documenting field research or collecting data in general instead of imposing a unique standard for all. Replication seems to be most suitable for quantitative research, where it adds value to the enterprise of teaching research methods and helps correct potential errors that may intentionally or unintentionally affect the integrity of research. This being said, we recognize that the information we are about to systematize tends to refer to a narrow understanding of transparency and reproducibility (as replication) that better fits quantitative analysis; however, this is not the authors’ bias, but a trend that is indicative of the status quo. We believe that having information on compliance with best practices, even if they are narrowly understood, is a first step towards disciplining our thoughts and organizing future discussions.

The increasing use of big data in Political Science applications makes the process of building best research practices even more relevant. The application of geographically referenced data in the social sciences is also on an upward trend and opens up new avenues for research. Including spatial distribution as a new dimension of analysis has proven most helpful in electoral studies, but also in sociological studies of the urban-rural distribution of populations, while at the same time taking into consideration other aspects of the social and political phenomena in question. Hence, the protocols governing how we process and analyze data are crucial to producing solid, credible research.

In this article we attempt to contribute to the ongoing discussion in the social sciences regarding the important issues of open science, replication, and pre-registration practices by looking into how these practices have been implemented in the past few years, drawing on data available online from different sources. Unfortunately, the data has proven to be scarce, as will be shown below, but we were able to learn about trends in the changing practices of the political science field.

The article proceeds as follows. Section II focuses on pre-registration practices in the field and tracks withdrawn articles from journals. The following section reviews a selection of journal requirements regarding the submission of replication files, offers a preview of voluntary posting of replication files at major American universities, and presents statistics on journals' open access policies. Section IV provides a regional focus on publishing trends in Latin American based journals.

II. PRE-REGISTRATION PRACTICES AND WITHDRAWN ARTICLES

Humphreys, Sanchez de la Sierra, and van der Windt (2013) have made a compelling case for nonbinding research registration of experimental and observational work. They are particularly concerned about the problem of fishing and warn that the results that do get reported may induce bias, as researchers have incentives to show positive results by selecting models that produce p-values suitable for publication. Incentives to obtain positive results are shaped both by publishing practices and standards and by the decisions that researchers make - even when well-intentioned - in order to fall within the range of results acceptable for publication (e.g., Gerber, Green, and Nickerson 2001). Under the maxim “publish or perish,” researchers, editors and reviewers set standards that may not always produce the most desirable outcomes. Silva (2019) cautions about possible malpractice in publishing decisions on Political Science randomized trials based on whether covariates are balanced, suggesting that overzealous reviewers and researchers' ex-post data manipulation affect the production of knowledge in the discipline.

One of the main benefits of pre-registration is that of reducing publication bias, considering that: 1) the academic community learns about a well-designed study regardless of the results of null hypothesis significance tests, 2) a registration process would allow scholars to submit high quality articles with null results for publication,5 3) it establishes in advance well-reasoned criteria for stopping data collection, and 4) registering a model specification prior to conducting the empirical analysis offers a more honest and theory-grounded method for hypothesis testing (Monogan 2013, 2014). Another of the many possible sources of publication bias is the impact that the institutional or country affiliations of authors may have on editors, which is why Nyhan (2015) proposed a triple-blind reviewing process, in which the identity of authors is concealed even from editors while the article is being evaluated.

An extended and rigid implementation of these practices, however, may prevent a researcher from making legitimate adjustments to the study or sample size that may be appropriate while conducting research (Tucker 2014). Who draws the line between appropriate deviations from a pre-registered design and practices that amount to p-fishing? Noting that political science research fluctuates between deduction and induction, and stressing the importance of conditions that favor one theoretical proposal over another, Laitin (2013) argues in favor of promoting the publication of replication studies and studies with null findings instead of focusing on a discipline-wide registry. And perhaps we are moving towards accepting that knowledge of failed experiments may teach us something. On August 1, 2018, the editors of the Journal of Political Science Education invited submissions for a special issue revolving around ideas that did not meet expectations, citing that “while negative results rarely get reported in academia, they are even more important when it comes to how we teach our students.”6

Pre-registration of hypotheses and research designs in general is largely voluntary in Political Science. We checked the Political Science Registered Studies Dataverse (https://dataverse.harvard.edu/dataverse/registration) in order to get an idea of how the practice of pre-registering articles is being implemented in the field. This registry contains information from 2009 onwards. As Table 1 shows, the number of pre-registered articles has tended to increase over the period, reaching a maximum in 2018. Although the past few years have shown an increase in the use of the registry, the number of pre-registered studies remains remarkably low.

Table 1 Pre-registered studies in Political Science Registered Studies Dataverse 

Year Number of studies
2009 1
2010 0
2011 1
2012 1
2013 0
2014 0
2015 1
2016 9
2017 13
2018 21
2019 9
2020 8

Source: Own compilation from data in the Political Science Registered Studies Dataverse.
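As an illustration, and not a description of our own compilation procedure, counts like those in Table 1 could be approximated with the public Dataverse Search API; the collection alias used below is inferred from the registry's URL cited above and may need adjusting.

```python
# Minimal sketch: tally registered studies per year in the Political Science
# Registered Studies Dataverse via the public Dataverse Search API.
import collections
import requests

API = "https://dataverse.harvard.edu/api/search"
params = {"q": "*", "type": "dataset", "subtree": "registration",
          "per_page": 100, "start": 0}
counts = collections.Counter()

while True:
    resp = requests.get(API, params=params, timeout=30)
    resp.raise_for_status()
    data = resp.json()["data"]
    for item in data["items"]:
        # "published_at" looks like "2018-05-04T12:34:56Z"; keep the year only
        counts[item.get("published_at", "unknown")[:4]] += 1
    params["start"] += params["per_page"]
    if params["start"] >= data["total_count"]:
        break

for year in sorted(counts):
    print(year, counts[year])
```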

Among a total of 64 pre-registered articles, 14 were published. Nineteen of those articles are authored by scholars based in European institutions, 37 by scholars based in the United States, and the remaining eight are the result of partnerships across institutions that include at least a European, an American, or a Latin American partner. Hence, it appears that the use of the Political Science Registered Studies Dataverse is still limited and that it is predominantly used by American-based scholars.

As Figure 1 below shows, out of the 64 studies registered, 11 were on classical topics in American Politics, 15 on Public Policy, nine referred to International Relations subject matters, eight to Political Psychology, seven to Political Methodology, five to Public Opinion, and the remaining nine were inscribed in the areas of elections, comparative politics, or political economy. In the Dataverse, 23 of the 64 registered studies were experiments in different subfields of the discipline: two of them were published, and all of them were registered between 2016 and 2020. As of December 2020, the Open Science Framework reported over 62,625 pre-registration records from different fields (quite possibly including many from students taking quantitative courses); it was not possible to track down their fields or whether they were eventually published.7

Source: Own compilation from data in the Political Science Registered Studies Dataverse.

Figure 1 Pre-registered studies by subfield, 2009-2020 

While adopting voluntary pre-registration practices appears to be a slow process, coming up with a minimum consensus that can be translated into publishing requirements appears to be more challenging. If it were possible to coordinate on appropriate standards for pre-registration, eventually the practices could become the norm. Developing standards that incorporate changing contexts and research topic specificities may allow for certain flexibility without falling into the trap of masking inductive research as deductive research. Nevertheless, the lack of incentives to do so may continue to delay the process.

Withdrawn Articles in the Field

New online initiatives attempt to keep a record of articles that journals withdraw. The websites https://www.bitss.org/ and https://retractionwatch.com/ post information on this subject, covering a wide range of disciplines. We inspected their contents and were able to find information on retracted articles in our discipline as of December 2020. We found seven withdrawn articles8 in the field of political science. The reasons for withdrawal were: limited or no information (three), plagiarism (two), and concerns about the data and the results (one); we could not find the journal's justification for the removal in the remaining case. While there is no comprehensive registry, the practice of making public an editorial decision to remove articles is slowly becoming more common.

III. REPLICATION REQUIREMENTS AND OPEN ACCESS

Dating back to the 1995 PS: Political Science and Politics issue on replication in the social sciences, the case for transparency in sharing data and documentation has gathered widespread support in the discipline, particularly among researchers who work with large N data. Allan Dafoe (2014: 63) recommended the adoption of the following transparency maxim: “Good research involves publishing complete replication files, making every step of research as explicit and reproducible as is practical.” Among the advantages of sharing replication files are that papers receive more citations and are more visible (Gleditsch, Metelits, and Strand 2003), although it could be argued that prominence is the result of high quality, which may be merely correlated with offering replication files. While the desirability of achieving higher standards of transparency has not been questioned, a number of researchers have pointed to the inability to share proprietary data or reveal the identity of informants, and other specific cases that would require flexibility in the criteria employed to enforce access to information needed to replicate studies.

The top 20 Political Science journals, as per the 2020 Journal Citation Report ranking, range in score from 5.912 to 3.069, with Political Communication ranked at the top. 65% of these journals are published quarterly, and 19 publish unsolicited articles, while the Annual Review of Political Science is the only one that publishes commissioned articles. The author guidelines of these top 20 journals revealed that:

  • 15 journals explicitly ask authors to submit the databases employed along with the manuscript. In some cases, publication is conditional on compliance with this requirement.

  • 14 require sending replication files to reproduce results. Three of those journals provide a detailed list of the types of files that authors must submit: 1) a README file describing all the files submitted, 2) a file containing the databases or other representations of the information linked to the study, and 3) a file indicating the software employed and, where applicable, the code or syntax needed to replicate the results (an illustrative archive layout is sketched below).

  • Six do not provide specifications regarding replication files.

Hence, about 70% of the top 20 journals in the field have rules on how authors must submit replication files from which analyses may be reproduced.
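As an illustration of the three file types enumerated above, a replication archive might be organized as follows; the layout and file names are hypothetical and are not mandated by any particular journal.

```
replication_archive/
  README.txt      describes every file in the archive and the software used
  data/
    dataset.csv   dataset(s) or other representations of the information analyzed
  code/
    analysis.py   script that reproduces all tables and figures from the data
```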

With regard to open access, 19 journals established that allowing for open access is up to the authors of the article; open access options are based on the journals' different types of subscription contracts.9 10 As has been highlighted elsewhere, four of the top 20 ranked political science journals recommend - although do not require - the pre-registration of hypotheses.11 Transparency and open access may, however, have contradictory trajectories of evolution. While replication materials are increasingly available to the general public free of charge, the actual published article may be subject to journal subscription-only access (such as JSTOR or SAGE).

Best practices indicate that novel data is usually made available to the research community about a year after it is originally published. It is quite possible that, even when peer-reviewed journals do not require the submission of replication materials, scholars voluntarily post them on their academic web pages. Hence, we looked into the pages of full-time professors based at the top five Political Science and International Studies departments according to the QS 2020 ranking.12 Out of a total of 245 faculty members, we found that 23.3% posted at least some of the datasets employed in their publications and 70.2% gave access to at least one of their articles and/or drafts. An interested reader could email a scholar in order to request a dataset, but of course the outcome of this process is not easily observed. All that can be said is that paper drafts tend to be available online, while datasets are much less frequently found online, and even less so the software code needed to replicate a published paper.

There are interdisciplinary initiatives to replicate published studies, most notably ReplicationWiki (http://replication.uni-goettingen.de/wiki/index.php/Main_Page), which hosts a number of replication works in the field of Economics. We draw information from another initiative, the Dataverse Project at https://dataverse.harvard.edu, an open-source research data repository software with 69 installations around the world, hosting over 4,444 dataverses (defined as sets of data, documentation, and metadata) and over 106,350 datasets as of December 2020. The project aims to “facilitate the public distribution of persistent, authorized, and verifiable data, with powerful but easy-to-use technology” (King 2007: 173). Figure 2 shows the evolution of files posted on the site, peaking in 2015, when the social sciences' participation in Dataverse catches up with the overall growth trend among all disciplines.13 While we cannot explain the 2015 maximum, we know that the observed trends are driven by changes in the number of files uploaded, which account for over 80% of all materials contributed to the site.

Source: own compilation from Dataverse Project's online records.

Figure 2 Number of Dataverses and Dataset files, 2007-2020 
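Aggregate counts like those underlying Figure 2 can in principle be retrieved from the installation itself. The sketch below assumes that the Harvard Dataverse exposes the standard Dataverse Metrics API described in the project's guides; the endpoint names are assumptions rather than something reported in our compilation.

```python
# Minimal sketch: cumulative counts of dataverses, datasets, files, and
# downloads up to a given month, using the (assumed) Dataverse Metrics API.
import requests

BASE = "https://dataverse.harvard.edu/api/info/metrics"
MONTH = "2020-12"  # cumulative totals up to December 2020

for metric in ("dataverses", "datasets", "files", "downloads"):
    resp = requests.get(f"{BASE}/{metric}/toMonth/{MONTH}", timeout=30)
    resp.raise_for_status()
    print(metric, resp.json()["data"]["count"])
```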

Of a total of 98 journal dataverses found in the Dataverse Project as of December 2020, 29 were in the area of Political Science and International Relations. Table 2 shows the number of records held in the dataverse managed by each of the 29 journals. Keeping in mind that posting replication materials is largely optional for authors, it is revealing to find numerous records per journal, with some journals leading the trend by having started the practice of leaving a digital paper trail of replication materials early on. It is notable that, over the years, more journals have adopted a Dataverse, with 19 uploading more data between 2018 and 2019; these are journals that publish research coming from different methodological traditions.

Table 2 Number of records held per journal Dataverse 

Journal Dataverse Number of records Coverage
The Journal of Politics 607 2015-2020
International Studies Quarterly 519 2007-2020
American Journal of Political Science 488 2012-2020
Political Analysis 453 2010-2020
British Journal of Political Science 329 2015-2020
Political Science Research and Methods 324 2013-2020
International Interactions (II): Empirical and Theoretical Research in International Relations 285 2010-2020
American Political Science Review 272 2007-2020
Research & Politics 224 2014-2020
Political Behavior 223 2015-2020
State Politics & Policy Quarterly 134 2013-2020
Journal of Experimental Political Science 105 2017-2020
Foreign Policy Analysis 95 2010-2019
Perspectives on Politics 93 2018-2020
Legislative Studies Quarterly 80 2007-2020
Journal of Public Policy 78 2016-2020
Italian Political Science Review 70 2014-2020
Brazilian Political Science Review 61 2018-2020
International Organization 60 2019-2020
PS: Political Science & Politics 44 2007-2020
World Politics 42 2017-2020
Palgrave Communications 37 2015-2020
International Interactions 25 2010
Latin American Politics and Society 22 2016-2020
Japanese Journal of Political Science 22 2019-2020
Journal of Information Technology and Politics 11 2008-2012
The Political Methodologist 8 2016-2018
Politics & Gender 2 2020
The Chinese Journal of International Politics 1 2018

Source: own compilation from records found online at the Dataverse Project.

The Dataverse website's statistics indicate that 45.4% of datasets added pertain to the area of the social sciences. 33.6% of dataverse files pertain to research projects, 31.1% are posted by researchers, 6.2% by research groups, 10.4% belong to organizations and institutions, and 2.2% to journals.

The statistics on files downloaded from Dataverse (see Figure 3) reveal a steady increase in this category. Not only are researchers voluntarily sharing more, but the academic community is also making use of this readily available information. Political Science, as a discipline, holds a position of leadership in the process of making replication materials available to the research community.

Source: Dataverse Metrics, https://dataverse.org/metrics.

Figure 3 File Downloads from 2016 to 2020 

As a means of facilitating transparency, open access to replication materials opens the door for the opportunity to scrutinize research and results at a wide scale. As was anticipated by the promoters of this practice, granting access to data and documentation fosters further research. But one may wonder if the activity of reproducing others’ research is left to senior researchers and graduate students willing to police (Laitin 2013) or if there is an actual professional benefit from the activity or a contribution to the frontier of knowledge. Madden, Easley, and Dunn (1995) found that journal editors in the social sciences are less enthusiastic than their counterparts in the natural sciences about publishing replication studies.

What is the payoff of replicating other scholars' work? How many replication studies actually get published in peer-reviewed journals? We conducted a simple search of Jstor research articles using the word “replication” for all journal titles listed as International Relations or Political Science from 2010 until 2020. We assume that any replication study contains the word “replication” somewhere in the article. A total of 2,618 articles fit the criteria, although very few of them had any relevance to our search interest. The majority of records consisted of comments on papers and the corresponding authors' responses, as well as articles discussing the benefits and challenges of replication practices rather than actual replication studies. It therefore appears necessary to provide more room for the publication of replication studies, perhaps in special issues as in the example of the Journal of Political Science Education cited above, in pre-designated sections of journals, or in journals entirely devoted to the replication enterprise. At present, we observe an increasing supply of replication materials but little incentive to actually do the replication work, which is a time-consuming effort (King 2006). If there are no replicators, the entire replication enterprise goes adrift, although we would still benefit from having access to the datasets to perform other studies and to teach quantitative methods.

There are other initiatives in the social sciences aimed at addressing the important issues of promoting research transparency, openness, and reproducibility. Since 2012, the Berkeley Initiative for Transparency in the Social Sciences (BITSS - https://www.bitss.org) has been working “to strengthen the integrity of social science research and evidence used for policy-making.” While the site does not offer an archive for data sharing, it works around five goals: “1) Build consensus on key issues facing students, faculty, researchers, funders, journals, and other key partners to be more transparent in the social sciences; 2) Improve our understanding of the problem and build evidence for solutions for increased transparency through long-term study of researcher practices; 3) Increase supply of and access to tools and resources for research transparency, which are necessary precursors for widespread adoption of best practices across the research community; 4) Deliver coursework and change research practices at scale by harnessing the BITSS network of students, academic faculty, and researchers (a “push” mechanism); and 5) Provide recognition and awards for the adoption of behaviors related to research transparency (a “pull” mechanism).”

The Center for Open Science (https://www.cos.io/) has implemented a number of tools and resources aimed at promoting transparency in empirical research. One of those initiatives is the Open Practice Badges (see Figure 4), which acknowledge open science practices in terms of data sharing, materials sharing, and pre-registration of the study. The badges certify that the contents are accessible and available in a persistent location. Evidence from the adoption of these badges by the journal Psychological Science suggests that they increase the rate of data sharing (Kidwell et al. 2016). Journal editors choose whether to subscribe to the Open Practice Badges and may assign badges for pre-registration, open access, and data sharing based on the author's disclosure statement or through independent peer review. As of 2020, the American Journal of Political Science and Political Communication are the sole academic publications in our field to have adopted this practice, which they implemented in 2016.14 The badges constitute a quick and efficient way to signal whether information is available, and while journals may not mandate their acquisition in order to be eligible for publication, badges do save time and effort for possible replicators and for others interested in accessing the materials for other purposes, such as answering alternative research questions with those data, learning from the code employed, or including them as replication activities in course syllabi.

Source: Center for Open Science.

Figure 4 Open Practice Badges 

The Center for Open Science administers the Open Science Framework (OSF, osf.io), which we consider to be an outstanding platform for file sharing, with some important properties that help ensure transparency. OSF allows for pre-registering the research plan and stores every new version of the work included. It is free of charge and imposes no limit on total storage capacity, although no single file can exceed five gigabytes. It also works well with Zotero and permits sharing collective projects with contributors, which proves very helpful for authors' collaborations and for teaching.
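Beyond the web interface, the contents of an OSF project can also be read programmatically through the public OSF REST API (https://api.osf.io/v2/). The following sketch is illustrative only; the project identifier is a hypothetical placeholder.

```python
# Minimal sketch: list the files stored under a public OSF project's default
# "osfstorage" provider, following the API's pagination links.
import requests

PROJECT_ID = "abcde"  # hypothetical 5-character OSF project identifier
url = f"https://api.osf.io/v2/nodes/{PROJECT_ID}/files/osfstorage/"

while url:
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    payload = resp.json()
    for entry in payload["data"]:
        attrs = entry["attributes"]
        print(attrs["kind"], attrs["name"])  # e.g. "file analysis.py"
    url = payload["links"].get("next")  # None once the last page is reached
```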

IV. A REGIONAL FOCUS: LATIN AMERICA

Much of the discussion that has occupied the discipline over the past two decades, beginning with King's 1995 agenda-setting article on the issue of transparency practices in empirical research, is foreign to many scholars working outside of the United States and Europe. Encouraging the adoption of these practices by the discipline in other parts of the world would be consistent with building a global academic community and fostering open science, in particular among those who work with quantitative analysis.

The discipline of Political Science has had a lethargic development in Latin America, having developed strong professional associations only over the past decade. Additionally, a good number of Latin American scholars who receive doctoral training in American universities stay in the United States upon graduation, with the notable exception of Brazilians (Malamud and Freidenberg 2011) - probably a multicausal phenomenon in which job opportunities in the home country play a part. Latin American political scientists compete with other disciplines for journal space in regionally based publications, as these are traditionally interdisciplinary (Narváez-Berthelemot and Russell 2001). Among all Latin American based journals listed in the Journal Citation Report or Scimago indexes, only two -Revista de Ciencia Política and Política y Gobierno- are exclusively focused on political science (Basabe-Serrano and Huertas-Hernández 2018).

There has been limited progress regarding the availability of replication files in the region. We reviewed all the journals based in Latin American countries that are listed in the Scimago index under the categories “Political Science and International Relations” and “Sociology and Political Science.” Of a total of 55 unique records, we found that only six academic journals require replication files, while all but four of them offer open access to the public. These statistics offer a clear view of the differing trajectories that replication and open access have followed in the discipline at the regional level.

Among the 12 journals specialized in Latin American topics that are not based in Latin American countries,15 we found that three explicitly list the submission of replication files as a requirement and seven offer open access to users. A plausible explanation for the limited number of publishing venues that require replication files may be that non-quantitative methods have a strong tradition in Latin America. Hence, it is possible that authors in the quantitative tradition decide to send their work for publication to journals based in other regions - especially those that publish in languages other than Spanish or Portuguese.16 Indeed, of the articles published between 2011 and 2018 in the 22 journals in Spanish or Portuguese that are listed in the JCR or Scimago indices, only 33.7% are either comparative or large-N studies, while the rest are case studies (Basabe-Serrano and Huertas-Hernández 2018).

With regard to the provision of open access to research articles, it is notable that the subset of journals analyzed overwhelmingly favors posting articles online, free of charge.17 This undoubtedly responds to the journals' business models but also to a growing consensus about making findings available. For example, the Latin American Council for the Social Sciences (CLACSO) offers a free-access digital repository of books and works in the social sciences.18 FLACSO provides an open repository of students' theses and researchers' books.

The preference for open access at the regional level is crystallized in the Scientific Electronic Library Online (SciELO) Project (www.scielo.org) index of academic journals, a program of the Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP) dating back to 1999. Most of the Latin American journals that are part of the Web of Science and Scopus indexes are part of SciELO and are peer-reviewed, open access, digital, and free of charge. SciELO works with national SciELO collections that are financed by national research institutions and follow the same standards (Packer et al. 2014). There are already 11 Latin American countries fully incorporated into the program, and two more countries are in the process of incorporation.19 The overall goal is to increase the quality and visibility of research produced in Latin America and the Caribbean.

The risks and opportunities of implementing research transparency and openness policies in publishing are being discussed in the non-American political science community as well. In Canada, where incentives have already been in place given the nature of state-funded research,20 the issue is not so much whether to make replication materials available per se, but the details involved in the process (Johnson et al. 2017). These authors point to eight concerns, among them that a uniform standard of data access and research transparency may have problems accommodating different epistemological and methodological traditions, and the high costs associated with storing and translating research documents produced in languages other than English.

Making an analogous effort to anticipate potential issues in implementing open science policies in Latin America, we think that open access is already a matter of consensus. Given the scarcity of specialized journals in political science and the relatively small share of studies produced using experimental and observational data, this may be fertile terrain for adopting best practices from the outset. What is more, in our view, the incorporation of transparent research practices in the classroom would help educate a new generation of Latin American researchers who are likely to consolidate these research practices in the profession, as students become researchers, journal editors, and professors.21

V. CONCLUDING REMARKS

The ability of the research community to keep tabs on the origins and use of data in scientific research lags behind the rate at which technology to generate large amounts of data - and papers analyzing these data - are produced. This situation presents both a challenge and an opportunity, especially in regions of the world where the extended use of quantitative analysis in research is still developing. Can the discipline come up with procedures to account for the transparency and reproducibility of analyses able to accommodate different research traditions? We agree with Dunning and Rosenblatt (2016) that advancing the agenda of transparency is feasible with multi-method and qualitative research but recognize that it is not possible to have a unique set of standards to achieve it. Above all, we should recognize the importance of promoting diversity in our profession regarding theoretical debates, expertise, and methods while upholding a shared interest in posing interesting and motivating questions, making compelling arguments, and presenting relevant evidence (Yashar 2016).

In the fragmentary data presented above, we identified certain trends in the practices of the profession. Pre-registration, as a means of minimizing the possibility of data manipulation and publication bias, has been growing over the years but is still nowhere near a generalized practice. Making datasets and code available is a much wider practice, done voluntarily by authors with the encouragement of an ever-increasing group of journal editors who adhere to these practices. Granting badges for pre-registration, data sharing, and open access has yet to take hold in the discipline, with only two journals currently implementing it. Paradoxically, replication files are increasingly available online and free of charge, while access to the actual research articles requires a paid subscription.

There remains much work to be done regarding incentives to publish replications of quantitative works. The evidence points to the fact that a change in the way we do things is taking place, even if only because we spend time discussing these issues. Much of the current controversy in the discipline revolves around coming up with acceptable journal policies to achieve transparency for all research traditions, under the presumption that norms will shape behaviors. However, we are observing unenforced changes in patterns of behavior that may ultimately be reflected in rules. Most likely, this will turn out to be an endogenous process, where rules will affect behaviors and vice versa.

An opportunity arises when the dissemination of data can be accompanied by the introduction of protocols and incentives in publication that promote good practices in the use of these data, many of which we have referred to in this paper. We dare suggest two avenues that we think are best suited to achieving these goals: teaching and publishing. Teaching new scholars good practices in empirical research will get them accustomed to working in this manner from the beginning of their careers. Incentives to publish high quality research - be it replications or original articles with or without null results - would help ensure that we are learning from our failures and our successes and that we remain in an equilibrium that keeps us all honest.

With regard to the development of open science in Latin America, we have shown that the practice of open access (free of charge) to academic articles is predominant. The fact that political science, as a discipline, is still in a stage of development offers ample opportunity to adopt transparency best practices in empirical research early on. We also encourage wider integration of different parts of the world in the ongoing discussions, given that part of open science is, incidentally, to share globally.

1See Broockman, Kalla, and Aronow (2015) for a comprehensive analysis of the LaCour (2014) data irregularities.

3Seeking to delay the project pending further discussion of its implications for qualitative data, confidential data, etc., 625 political scientists signed a petition (Janz 2015). However, some of the top journals made explicit in their guidelines how to handle private and confidential information when dealing with human subjects or other kinds of sensitive information.

4We thank an anonymous reviewer for pointing this out.

5What might incentivize journals to publish null results is the transparency involved in the process, so that science can evolve and fulfill its purpose. See Shields 2000; Gerber, Green, and Nickerson 2001; Landis et al. 2014; Findley et al. 2016.

8Withdrawal means: “The original article is removed from access on the Journal's publishing platform.” Retrieved on December 31, 2020 from https://retractionwatch.com/retraction-watch-database-user-guide/retraction-watch-database-user-guide-appendix-b-reasons/

9Journals belonging to five commercial publishers concentrate around 51% of social science publications; none of them offer open access by default, only as a paid option (Larivière, Haustein, and Mongeon 2015).

10Given that the publishing companies' business models are based on contributions from academic libraries, professional associations, and individual subscriptions, open access and open science movements are hard to encourage (Rodríguez Medina 2019).

11The journals are: Political Communication, Political Analysis, Journal of Public Administration Research and Theory, and Political Behavior. The latter requires pre-registration in the case of clinical trials.

12These are: 1) Harvard University, 2) Princeton University, 3) Sciences Po, 4) University of Oxford, and 5) London School of Economics and Political Science (LSE).

13After 2015, the correlation between the variables is equal to 0.9.

15These are: Bulletin of Latin American Research, Canadian Journal of Latin American and Caribbean Studies, European Review of Latin American and Caribbean Studies, Journal of Latin American Studies, Latin American Perspectives, Latin American Politics and Society, Latin American Research Review, Asian Journal of Latin American Studies, Journal of Politics in Latin America, América Latina Hoy, Latin America Policy, and Anuario latinoamericano – Ciencias Políticas y Relaciones Internacionales.

16The United States and the United Kingdom together publish more than half of all specialized journals around the world, contributing to the hegemony of the English language (Rodríguez Medina 2019).

17There may also be legal requirements in some countries to post research results online, as in the case of the 2016 Ecuadorian law on the social economy of knowledge.

19From Red Scielo website. Retrieved on December 31, 2020 from http://www.scielo.org/php/index.php?lang=es

20Canadian tri-agency funding refers to CIHR, NSERC, SSHRC.

21The TIER protocol (Ball and Medeiros 2012) is an excellent resource for implementing replication practices in teaching.

REFERENCES

Ball, Richard, and Norm Medeiros. 2012. "Teaching Integrity in Empirical Research: A Protocol for Documenting Data Management and Analysis." The Journal of Economic Education 43(2): 182-89.

Basabe-Serrano, Santiago, and Sergio Huertas-Hernández. 2018. "El estado de la investigación en ciencia política sobre América Latina." Revista Española de Ciencia Política 47: 153-70.

Bowers, Jake, and Maarten Voors. 2016. "Cómo mejorar su relación con su futuro yo." Revista de Ciencia Política 36(3): 829-48.

Broockman, David, Joshua Kalla, and Peter Aronow. 2015. "Irregularities in LaCour (2014)." Working paper. Stanford University. Retrieved June 23 from http://stanford.edu/∼dbroock/broockman_kalla_aronow_lg_irregularities.pdf.

Dafoe, Allan. 2014. "Science Deserves Better: The Imperative to Share Complete Replication Files." PS: Political Science & Politics 47(1): 60-66.

Dunning, Thad, and Fernando Rosenblatt. 2016. "Transparency and Reproducibility in Multi-Method Research." Revista de Ciencia Política 36(3): 773-83.

Findley, Michael G., Nathan M. Jensen, Edmund J. Malesky, and Thomas B. Pepinsky. 2016. "Can Results-Free Review Reduce Publication Bias? The Results and Implications of a Pilot Study." Comparative Political Studies 49(13): 1667-1703.

Foster, Drew. 2015. "Will Academia Waste the Michael LaCour Scandal?" The Cut. Retrieved June 23 from https://www.thecut.com/2015/06/will-academia-waste-the-michael-lacour-scandal.html

Gerber, Alan S., Donald P. Green, and David Nickerson. 2001. "Testing for Publication Bias in Political Science." Political Analysis 9(4): 385-92.

Gleditsch, Nils Petter, Claire Metelits, and Havard Strand. 2003. "Posting your data: Will you be scooped or will you be famous." International Studies Perspectives 4(1): 89-97.

Humphreys, Macartan, Raul Sanchez de la Sierra, and Peter van der Windt. 2013. "Fishing, Commitment, and Communication: A Proposal for Comprehensive Nonbinding Research Registration." Political Analysis 21(1): 1-20.

Isaac, Jeffrey C. 2015. "For a More Public Political Science." Perspectives on Politics 13(2): 269-83.

Janz, Nicole. 2015. "Political Scientists Trying to Delay Research Transparency." Political Science Replication (blog). Retrieved June 23 from https://politicalsciencereplication.wordpress.com/2015/11/07/political-scientists-trying-to-delay-research-transparency/.

Johnson, Genevieve Fuji, Mark Pickup, Eline A. de Rooij, and Rémi Léger. 2017. "Research Openness in Canadian Political Science: Toward an Inclusive and Differentiated Discussion." Canadian Journal of Political Science 50(1): 311-28.

Kidwell, Mallory C., Ljiljana B. Lazarević, Erica Baranski, Tom E. Hardwicke, Sarah Piechowski, Lina-Sophia Falkenberg, Curtis Kennett, et al. 2016. "Badges to Acknowledge Open Practices: A Simple, Low-Cost, Effective Method for Increasing Transparency." PLOS Biology 14(5): e1002456.

King, Gary. 1995. "Replication, Replication." PS: Political Science and Politics 28(3): 444.

King, Gary. 2006. "Publication, Publication." PS: Political Science & Politics 39(1): 119-25.

King, Gary. 2007. "An Introduction to the Dataverse Network as an Infrastructure for Data Sharing." Sociological Methods & Research 36(2): 173-99.

Laitin, David D. 2013. "Fisheries Management." Political Analysis 21(1): 42-47.

Landis, Ronald S., Lawrence R. James, Charles E. Lance, Charles A. Pierce, and Steven G. Rogelberg. 2014. "When Is Nothing Something? Editorial for the Null Results Special Issue of Journal of Business and Psychology." Journal of Business and Psychology 29(2): 163-67.

Larivière, Vincent, Stefanie Haustein, and Philippe Mongeon. 2015. "The Oligopoly of Academic Publishers in the Digital Era." PLOS ONE 10(6): e0127502.

Lenine, Enzo, and Melina Mörschbächer. 2019. "La iniciativa DA-RT en la ciencia política estadounidense: discursos acerca de una política de transparencia y acceso a datos." Revista Mexicana de Ciencias Políticas y Sociales 64(235): 109-137.

Madden, Charles S., Richard W. Easley, and Mark G. Dunn. 1995. "How Journal Editors View Replication Research." Journal of Advertising 24(4): 77-87.

Malamud, Andrés, and Flavia Freidenberg. 2011. "La diáspora rioplatense: presencia e impacto de los politólogos argentinos, brasileños y uruguayos en el exterior." In Más allá de la fuga de cerebros. Movilidad, migración y diáspora de los argentinos calificados, edited by Lucas Luchilo. Buenos Aires: Eudeba.

Monogan, James E. 2013. "A Case for Registering Studies of Political Outcomes: An Application in the 2010 House Elections." Political Analysis 21(1): 21-37.

Monogan, James E. 2014. "The Pros of Preregistration for Political Science." Oxford University Press blog (blog). Retrieved June 23 from https://blog.oup.com/2014/09/pro-con-research-preregistration/.

Narváez-Berthelemot, Nora, and Jane M. Russell. 2001. "World Distribution of Social Science Journals. A View from the Periphery." Scientometrics 51(1): 223-39.

Nyhan, Brendan. 2015. "Increasing the Credibility of Political Science Research: A Proposal for Journal Reforms." PS: Political Science & Politics 48(S1): 78-83.

Packer, A. L., Nicholas Cop, Adriana Luccisano, Amanda Ramalho, and Ernesto Spinak. 2014. SciELO: 15 Años de Acceso Abierto: un estudio analítico sobre Acceso Abierto y comunicación científica. UNESCO Publishing.

Rodríguez Medina, Leandro. 2019. "A Geopolitics of Open Access: Information, Software and Reading." Estudios Sociológicos 37(111): 727-55.

Shields, Peter. 2000. "Publication Bias Is a Scientific Problem with Adverse Ethical Outcomes: The Case for a Section for Null Results." Cancer Epidemiology, Biomarkers & Prevention 9: 771-772.

Silva, Pedro. 2019. "Malpractice in the Publication of Political Science Experiments: An Investigation Using Covariate Balance Data." [Paper Session] 77th Annual MPSA Conference, Chicago, Illinois. https://convention2.allacademic.com/one/mpsa/mpsa19/index.php?cmd=Online+Program+View+Paper&selected_paper_id=1482264&PHPSESSID=4ncj5jl997dci7fuefg2tb2djm

Simmons, Joseph P., Leif D. Nelson, and Uri Simonsohn. 2011. "False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant." Psychological Science 22(11): 1359-66.

Tripp, Aili Mari. 2018. "Transparency and Integrity in Conducting Field Research on Politics in Challenging Contexts." Perspectives on Politics 16(3): 728-38.

Tucker, Joshua. 2014. "Experiments, Preregistration, and Journals." The Pros and Cons of Research Preregistration (blog). Retrieved June 23 from https://blog.oup.com/2014/09/pro-con-research-preregistration/.

Yashar, Deborah J. 2016. "Editorial Trust, Gatekeeping, and Unintended Consequences." Comparative Politics Newsletter 26(1): 57-64.

Young, Joseph K., and Nicole Janz. 2015. "What Social Science Can Learn From the LaCour Scandal." The Chronicle of Higher Education. Retrieved June 23 from https://www.chronicle.com/article/what-social-science-can-learn-from-the-lacour-scandal/

Received: October 25, 2019; Accepted: April 17, 2021

Creative Commons License This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License, which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.