I. INTRODUCTION
The concern with raising our disciplinary standards and adopting best practices to ensure transparency in empirical research has been an ongoing conversation dating back to Gary King's (1995) landmark article on replication. Contrary to what has occurred in other disciplines, political science had not undergone a major crisis regarding the credibility of research; neither had there been cases of fraud or journal article retractions until 2015. The LaCour scandal changed that trajectory and drew our science into a new discussion (Foster 2015; Young and Janz 2015; Findley et al. 2016).1 Over the past decade, scholars in the social sciences have produced a number of articles that address the need for more transparent empirical research practices (see, for example, Bowers and Voors 2016), in particular practices that prevent p-fishing and p-hacking and that allow for replication. Progress has been made in estimating the magnitude of the problem, even when one starts from the assumption that researchers seek to produce relevant, solid work - which is also publishable. For example, Simmons, Nelson, and Simonsohn show “how unacceptably easy it is to accumulate (and report) statistically significant evidence for a false hypothesis” (2011: 1359).
The American Political Science Association (APSA) has taken a leading role in promoting the incorporation of transparency into the profession's research practices. In 2012 the APSA Council amended its Guide to Professional Ethics in Political Science to incorporate Data Access and Research Transparency (DA-RT) principles that had been prepared by an ad hoc committee. Additionally, 27 journals signed a 2014 statement on transparency (JETS)2 committing themselves to take specific steps toward transparency by January 15, 2016 - including requiring authors to make data available. A number of symposia, panel discussions, and journal special issues on the topic continued the discussion, given the many concerns about how this would be implemented and how it would affect the costs and burden of doing research.3 One line of division was the methodological approach employed, as the feasibility and costs of sharing data and metadata can vary significantly depending on the type of study. This methodological divide resulted in two separate draft guidelines on transparency, depending on whether the study was qualitative or quantitative – an old divide in the discipline (Guidelines for Data Access and Research Transparency for Qualitative Research in Political Science and Guidelines for Data Access and Research Transparency for Quantitative Research in Political Science). Others proposed that the problem was heuristic in nature, the core issue being whether the approach employed in research was positivist or non-positivist (see, for example, Isaac 2015). These are undoubtedly important and controversial issues that have been discussed at length elsewhere and do not constitute the core of this piece.
However, two specific topics relevant to this paper are part of the discussions that divide the discipline. First, the meaning of transparency may not be the same in the quantitative and qualitative research traditions, as APSA's Qualitative Transparency Deliberations have made clear. While it may be feasible for quantitative researchers to share datasets and files after publication, a qualitative researcher may be bound by ethical and legal considerations in deciding whether to publicly expose transcripts of interviews conducted in complex contexts, or how to share information about the interpersonal nature of field research (see Tripp (2018) for a detailed analysis). Additionally, the debate over the potential adoption of universal research practices has led to rethinking the meaning of producing knowledge and what our role as political scientists is (Lenine and Mörschbächer 2019).
The second crucial topic has to do with the desirability and feasibility of replication. There is an important distinction between “reproducibility” and “replication.”4 While political scientists share an interest in producing quality research, sometimes replication is simply impossible: How can the experience of ethnography be replicated? Would doing so even make sense? Perhaps it would be important to consider an approach that focuses on the rigor of the process of documenting field research or collecting data in general, instead of imposing a single standard for all. Replication seems most suitable for quantitative research, where it adds value to the enterprise of teaching research methods and helps correct potential errors that may intentionally or unintentionally affect the integrity of research. That said, we recognize that the information we are about to systematize tends to reflect a narrow understanding of transparency and reproducibility (as replication) that better fits quantitative analysis; this is not the authors’ bias, however, but a trend indicative of the status quo. We believe that having information on compliance with best practices, even if narrowly understood, is a first step toward disciplining our thoughts and organizing future discussions.
The increasing use of big data in Political Science applications makes the process of building best research practices even more relevant. The application of geographically referenced data in the social sciences is also on an upward trend and opens up new avenues for research. Including spatial distribution as a new dimension of analysis has proven most helpful in electoral studies, but also in sociological studies of the urban-rural distribution of populations, while taking into consideration other aspects of the social and political phenomena in question. Hence, the protocols by which we process and analyze data are crucial to producing solid, credible research.
In this article we attempt to contribute to the ongoing discussion in the social sciences regarding the important issues of open science, replication, and pre-registration practices by looking into how these practices have been implemented in the past few years, drawing on data available online from different sources. Unfortunately, the data has proven to be scarce, as will be shown below, but we were able to learn about trends in the changing practices of the political science field.
The article proceeds as follows. Section II focuses on pre-registration practices in the field and tracks withdrawn articles from journals. The following section reviews a selection of journal requirements regarding the submission of replication files, offers a preview of voluntary posting of replication files at major American universities, and presents statistics on journals' open access policies. Section IV provides a regional focus on publishing trends in Latin American-based journals. Section V concludes.
II. PRE-REGISTRATION PRACTICES AND WITHDRAWN ARTICLES
Humphreys, Sanchez de la Sierra, and van der Windt (2013) have made a compelling case for nonbinding research registration of experimental and observational work. They are particularly concerned about the problem of fishing and warn that the results that do get reported may induce bias, as researchers have incentives to show positive results by selecting models that produce p-values suitable for publication. Incentives to obtain positive results are shaped both by publishing practices and standards and by the decisions that researchers make - even when well-intentioned - in order to fall within the range of results acceptable for publication (e.g., Gerber, Green, and Nickerson 2001). Under the maxim “publish or perish,” researchers, editors, and reviewers set standards that may not always produce the most desirable outcomes. Silva (2019) cautions about possible malpractice in publishing decisions on Political Science randomized trials based on whether covariates are balanced, suggesting that overzealous reviewers and researchers' ex-post data manipulation affect the production of knowledge in the discipline.
One of the main benefits of pre-registration is that of reducing publication bias, considering that: 1) the academic community learns about a well-designed study regardless of the results of null hypothesis significance tests, 2) a registration process would allow scholars to submit high quality articles with null results for publication,5 3) it offers in advance well-reasoned criteria for stopping data collection, and 4) registering a model specification prior to conducting the empirical analysis offers a more honest and theory-grounded method for hypothesis testing (Monogan 2013, 2014). Another of the many possible sources of publication bias is the impact that authors' institutional or country affiliations may have on editors, which is why Nyhan (2015) proposed a triple-blind reviewing process, in which the identity of authors is concealed even from editors while the article is being evaluated.
A broad and rigid implementation of these practices, however, may prevent researchers from making legitimate adjustments to the study design or sample size that may be appropriate while conducting research (Tucker 2014). Who draws the line between appropriate deviations from a pre-registered design and practices that amount to p-fishing? Noting that political science research fluctuates between deduction and induction, and the importance of the conditions that favor one theoretical proposal over another, Laitin (2013) argues in favor of promoting the publication of replication studies and studies with null findings instead of focusing on a discipline-wide registry. And perhaps we are moving toward accepting that knowledge of failed experiments may teach us something. On August 1, 2018, the editors of the Journal of Political Science Education invited submissions for a special issue revolving around ideas that did not meet expectations, citing that “while negative results rarely get reported in academia, they are even more important when it comes to how we teach our students.”6
Pre-registration of hypotheses and research designs is largely voluntary in Political Science. We checked the Political Science Registered Studies Dataverse (https://dataverse.harvard.edu/dataverse/registration) in order to get an idea of how the practice of pre-registering studies is being implemented in the field. This registry contains information from 2009 onward. As table 1 shows, the number of pre-registered studies has tended to increase over the period, reaching a maximum in 2018. Although the past few years have shown an increase in use of the registry, the number of pre-registered studies remains remarkably low.
Table 1 Pre-registered studies in Political Science Registered Studies Dataverse
Year | Number of studies |
---|---|
2009 | 1 |
2010 | 0 |
2011 | 1 |
2012 | 1 |
2013 | 0 |
2014 | 0 |
2015 | 1 |
2016 | 9 |
2017 | 13 |
2018 | 21 |
2019 | 9 |
2020 | 8 |
Source: Own compilation from data in the Political Science Registered Studies Dataverse.
Of a total of 64 pre-registered studies, 14 were published. Nineteen of those studies are authored by scholars based at European institutions, 37 by scholars based in the United States, and the remaining eight are the result of partnerships across institutions that include at least one European, American, or Latin American partner. Hence, the use of the Political Science Registered Studies Dataverse is still limited and predominantly driven by US-based scholars.
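For readers who wish to update or extend counts of this kind, the registry can be queried programmatically. The sketch below is illustrative rather than the authors' original procedure: it tallies the datasets in the registration dataverse by publication year through the public Dataverse Search API, assuming the subtree alias "registration" (taken from the URL above) and the usual field names in the JSON response.

```python
# Hedged sketch: count pre-registered studies per year in the Political Science
# Registered Studies Dataverse via the Dataverse Search API. The subtree alias and
# the "published_at" field name are assumptions about the installation's response.
import collections
import requests

API = "https://dataverse.harvard.edu/api/search"

def registered_studies_by_year(subtree="registration", per_page=100):
    """Return a Counter mapping publication year to number of registered datasets."""
    counts, start = collections.Counter(), 0
    while True:
        resp = requests.get(API, params={
            "q": "*", "type": "dataset", "subtree": subtree,
            "per_page": per_page, "start": start,
        })
        resp.raise_for_status()
        data = resp.json()["data"]
        for item in data["items"]:
            year = item.get("published_at", "")[:4]  # e.g. "2018-05-04T..." -> "2018"
            if year:
                counts[year] += 1
        start += per_page
        if start >= data["total_count"]:
            return counts

if __name__ == "__main__":
    for year, n in sorted(registered_studies_by_year().items()):
        print(year, n)
```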
As figure 1 shows, out of the 64 studies registered, 11 were on classical topics in American Politics, 15 on Public Policy, nine on International Relations subject matters, eight on Political Psychology, seven on Political Methodology, five on Public Opinion, and the other nine fell into the areas of elections, comparative politics, or political economy. Twenty-three of the 64 registered studies were experiments in different subfields of the discipline; two of them were published, and all were registered between 2016 and 2020. As of December 2020, the Open Science Framework reported over 62,625 pre-registration records from different fields (quite possibly including many from students taking quantitative courses); it was not possible to track down their fields or whether they were eventually published.7

Source: Own compilation from data in the Political Science Registered Studies Dataverse.
Figure 1 Pre-registered studies by subfield, 2009-2020
While the adoption of voluntary pre-registration appears to be a slow process, reaching a minimum consensus that can be translated into publishing requirements appears even more challenging. If it were possible to coordinate on appropriate standards for pre-registration, the practice could eventually become the norm. Developing standards that incorporate changing contexts and the specificities of research topics may allow for a certain flexibility without falling into the trap of masking inductive research as deductive research. Nevertheless, the lack of incentives to do so may continue to delay the process.
Withdrawn Articles in the Field
New online initiatives attempt to keep a record of articles that journals withdraw. The websites https://www.bitss.org/ and https://retractionwatch.com/ post material on this subject across a wide range of disciplines. We inspected their contents and were able to find information on retracted articles in our discipline as of December 2020. We found seven withdrawn articles8 from the field of political science. The reasons for withdrawal were: limited or no information given (three), plagiarism (two), and concerns about the data and the results (one); we could not find the journal's justification for the removal in the remaining case. While there is no comprehensive registry, the practice of making public an editorial decision to remove articles is slowly becoming more common.
III. REPLICATION REQUIREMENTS AND OPEN ACCESS
Dating back to the 1995 PS: Political Science and Politics issue on replication in the social sciences, the case for transparency in sharing data and documentation has gathered widespread support in the discipline, particularly among researchers who work with large-N data. Allan Dafoe (2014: 63) recommended the adoption of the following transparency maxim: “Good research involves publishing complete replication files, making every step of research as explicit and reproducible as is practical.” Among the advantages of sharing replication files is that papers receive more citations and are more visible (Gleditsch, Metelits, and Strand 2003), although it could be argued that prominence is the result of high quality, which may be merely correlated with offering replication files. While the desirability of achieving higher standards of transparency has not been questioned, a number of researchers have pointed to the inability to share proprietary data or reveal the identity of informants, and to other specific cases that would require flexibility in the criteria employed to enforce access to the information needed to replicate studies.
The top 20 Political Science journals, as per the 2020 Journal Citation Report ranking, have scores ranging from 5.912 down to 3.069, with Political Communication ranked at the top. Sixty-five percent of these journals are published quarterly, and 19 publish unsolicited articles, while the Annual Review of Political Science is the only one that publishes commissioned articles. The author guidelines of these top 20 journals revealed that:
15 journals explicitly ask authors to submit the databases employed along with the manuscript. In some cases, publication is conditional on compliance with this requirement.
14 require sending replication files to reproduce results. Three of those journals provide a detailed list of the types of files that authors must submit: 1) a readme file describing all the files submitted, 2) a file containing the datasets or other representations of the information linked to the study, and 3) a file indicating the software employed and, where applicable, the code or syntax needed to replicate the results.
Six do not provide specifications regarding replication files.
Hence, about 70% of the top 20 journals in the field have rules on how authors must submit replication files from which analyses may be reproduced.
With regard to Open Access, 19 journals establish that allowing for open access is up to the authors of the article; open access options are based on the journals’ different types of subscription contracts.9 10 As has been highlighted elsewhere, four of the top 20 ranked political science journals recommend - although do not require - the pre-registration of hypotheses.11 Transparency and open access may, however, follow contradictory trajectories. While replication materials are increasingly available to the general public free of charge, the published article itself may be subject to subscription-only access (through platforms such as JSTOR or Sage).
Best practice suggests that novel data are usually made available to the research community about a year after the original publication. It is quite possible that, even when peer-reviewed journals do not require the submission of replication materials, scholars post them voluntarily on their academic web pages. Hence, we looked into the pages of full-time professors based at the top five Political Science and International Studies departments according to the QS 2020 ranking.12 Out of a total of 245 faculty members, we found that 23.3% posted at least some of the datasets employed in their publications and 70.2% gave access to at least one of their articles and/or drafts. An interested reader could email a scholar to request a dataset, but of course the outcome of this process is not easily observed. All that can be said is that paper drafts tend to be available online, while datasets are much less frequently found online, and the software code needed to replicate a published paper even less so.
There are interdisciplinary initiatives to replicate published studies, most notably ReplicationWiki (http://replication.uni-goettingen.de/wiki/index.php/Main_Page), which hosts a number of replication works in the field of Economics. We draw on information from another initiative, the Dataverse Project at https://dataverse.harvard.edu, an open-source research data repository software with 69 installations around the world hosting over 4,444 dataverses (defined as sets of data, documentation, and metadata) and over 106,350 datasets as of December 2020. The project aims to “facilitate the public distribution of persistent, authorized, and verifiable data, with powerful but easy-to-use technology” (King 2007: 173). Figure 2 shows the evolution of files posted on the site, peaking in 2015, when the social sciences’ participation in Dataverse caught up with the overall growth trend across all disciplines.13 While we cannot explain the 2015 maximum, we know that the observed trends are driven by changes in the number of files uploaded, which account for over 80% of all materials contributed to the site (a sketch of how such counts can be retrieved programmatically follows figure 2).

Source: own compilation from Dataverse Project's online records.
Figure 2 Number of Dataverses and Dataset files, 2007-2020
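The cumulative counts plotted in figure 2 can, in principle, be approximated from the Dataverse Metrics API. The following is a minimal, hedged sketch assuming the Harvard installation exposes the standard /api/info/metrics/.../toMonth/ routes and that the response carries the count under data.count; it is illustrative rather than the authors' compilation procedure.

```python
# Hedged sketch: cumulative end-of-year counts of dataverses and datasets from the
# Dataverse Metrics API. The toMonth routes and the {"data": {"count": ...}} response
# shape are assumptions about the standard metrics endpoints.
import requests

BASE = "https://dataverse.harvard.edu/api/info/metrics"

def cumulative_count(metric, year):
    """Cumulative count of 'dataverses', 'datasets', 'files', or 'downloads'
    up to and including December of the given year."""
    resp = requests.get(f"{BASE}/{metric}/toMonth/{year}-12")
    resp.raise_for_status()
    return resp.json()["data"]["count"]

if __name__ == "__main__":
    for year in range(2007, 2021):
        print(year,
              cumulative_count("dataverses", year),
              cumulative_count("datasets", year))
```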
Of a total of 98 journal dataverses found in the Dataverse Project as of December 2020, 29 were in the area of Political Science and International Relations. Table 2 shows the number of records held in the dataverse managed by each of those 29 journals. Keeping in mind that posting replication materials is largely optional for authors, it is revealing to find numerous records per journal, with some journals leading the trend by having started the practice of leaving a digital paper trail of replication materials early. It is notable that more journals have adopted a dataverse over the years, with 19 uploading additional data between 2018 and 2019, and that these are journals publishing research from different methodological traditions.
Table 2 Number of records held per journal Dataverse
Journal Dataverse | Number of records | Coverage |
---|---|---|
The Journal of Politics | 607 | 2015-2020 |
International Studies Quarterly | 519 | 2007-2020 |
American Journal of Political Science | 488 | 2012-2020 |
Political Analysis | 453 | 2010-2020 |
British Journal of Political Science | 329 | 2015-2020 |
Political Science Research and Methods | 324 | 2013-2020 |
International Interactions (II): Empirical and Theoretical Research in International Relations | 285 | 2010-2020 |
American Political Science Review | 272 | 2007-2020 |
Research & Politics | 224 | 2014-2020 |
Political Behavior | 223 | 2015-2020 |
State Politics & Policy Quarterly | 134 | 2013-2020 |
Journal of Experimental Political Science | 105 | 2017-2020 |
Foreign Policy Analysis | 95 | 2010-2019 |
Perspectives on Politics | 93 | 2018-2020 |
Legislative Studies Quarterly | 80 | 2007-2020 |
Journal of Public Policy | 78 | 2016-2020 |
Italian Political Science Review | 70 | 2014-2020 |
Brazilian Political Science Review | 61 | 2018-2020 |
International Organization | 60 | 2019-2020 |
PS: Political Science & Politics | 44 | 2007-2020 |
World Politics | 42 | 2017-2020 |
Palgrave Communications | 37 | 2015-2020 |
International Interactions | 25 | 2010 |
Latin American Politics and Society | 22 | 2016-2020 |
Japanese Journal of Political Science | 22 | 2019-2020 |
Journal of Information Technology and Politics | 11 | 2008-2012 |
The Political Methodologist | 8 | 2016-2018 |
Politics & Gender | 2 | 2020 |
The Chinese Journal of International Politics | 1 | 2018 |
Source: own compilation from records found online at the Dataverse Project.
The Dataverse website's statistics indicate that 45.4% of the datasets added pertain to the area of the social sciences. Of the dataverse files, 33.6% pertain to research projects, 31.1% are posted by individual researchers, 6.2% by research groups, 10.4% belong to organizations and institutions, and 2.2% to journals.
The statistics on files downloaded from Dataverse (see figure 3) reveal a steady increase in downloads. Not only are researchers voluntarily sharing more, but the academic community is also making use of this readily available information. Political Science, as a discipline, holds a position of leadership in the process of making replication materials available to the research community.
As a means of facilitating transparency, open access to replication materials opens the door to scrutinizing research and results on a wide scale. As was anticipated by the promoters of this practice, granting access to data and documentation fosters further research. But one may wonder whether the activity of reproducing others’ research is left to senior researchers and graduate students willing to police the field (Laitin 2013), or whether there is an actual professional benefit from the activity or a contribution to the frontier of knowledge. Madden, Easley, and Dunn (1995) found that journal editors in the social sciences are less enthusiastic than their counterparts in the natural sciences about publishing replication studies.
What is the payoff of replicating other scholars’ work? How many replication studies actually get published in peer-reviewed journals? We conducted a simple search of JSTOR research articles using the word “replication” across all journal titles listed under International Relations or Political Science from 2010 to 2020, assuming that any replication study contains the word somewhere in the article. A total of 2,618 articles fit the criteria, although very few were relevant to our search interest. The majority of records consisted of comments on papers and the corresponding authors’ responses, along with several articles discussing the benefits and challenges of replication practices, rather than actual replication studies. It therefore appears necessary to provide more room for the publication of replication studies, perhaps in special issues such as the Journal of Political Science Education example cited above, in pre-designated journal sections, or in journals entirely devoted to the replication enterprise. At present, we observe an increasing supply of replication materials but little incentive to actually do the replication work, which is a time-consuming effort (King 2006). If there are no replicators, the entire replication experience goes adrift, although we would still benefit from having access to the datasets to perform other studies and to teach quantitative methods.
There are other initiatives in the social sciences aimed at addressing the important issues of promoting research transparency, openness, and reproducibility. Since 2012, the Berkeley Initiative for Transparency in the Social Sciences (BITSS - https://www.bitss.org) has been working “to strengthen the integrity of social science research and evidence used for policy-making.” While the site does not offer an archive for data sharing, it works around five goals: “1) Build consensus on key issues facing students, faculty, researchers, funders, journals, and other key partners to be more transparent in the social sciences; 2) Improve our understanding of the problem and build evidence for solutions for increased transparency through long-term study of researcher practices; 3) Increase supply of and access to tools and resources for research transparency, which are necessary precursors for widespread adoption of best practices across the research community; 4) Deliver coursework and change research practices at scale by harnessing the BITSS network of students, academic faculty, and researchers (a “push” mechanism); and 5) Provide recognition and awards for the adoption of behaviors related to research transparency (a “pull” mechanism).”
The Center for Open Science (https://www.cos.io/) has implemented a number of tools and resources aimed at promoting transparency in empirical research. One of those initiatives is the Open Practice Badges (see figure 4), which acknowledge open science practices in terms of data sharing, materials sharing, and pre-registration of the study. The badges certify that the contents are accessible and available in a persistent location. Evidence from the adoption of these badges by the journal Psychological Science suggests that they increase the rate of data sharing (Kidwell et al. 2016). Journal editors choose whether to subscribe to the Open Practice Badges and may assign badges for pre-registration, open access, and data sharing based on the author's disclosure statement or through independent peer review. As of 2020, the American Journal of Political Science and Political Communication are the only academic publications in our field to have adopted this practice, which they implemented in 2016.14 The badges constitute a quick and efficient way to signal whether information is available, and while journals may not mandate earning them as a condition for publication, badges do save time and effort for possible replicators and for others interested in accessing the materials for other purposes, such as answering alternative research questions based on the data, learning from the code employed, or including replication activities in course syllabi.
The Center for Open Science administers the Open Science Framework (OSF, osf.io), which we consider outstanding software for file sharing, with several important properties that help ensure transparency. OSF allows researchers to pre-register their research plan and stores every new version of the work included. It is free of charge and imposes no limit on total storage capacity, although no single file can exceed five gigabytes. It also works well with Zotero and permits sharing collective projects with contributors, which proves very helpful for collaboration among authors and for teaching.
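OSF projects can also be inspected programmatically through the public OSF REST API (api.osf.io/v2). The snippet below is a minimal, hedged sketch rather than an official recipe: the project identifier is hypothetical, and the attribute names assume the JSON:API layout the endpoint typically returns.

```python
# Hedged sketch: check whether an OSF project (and thus its pre-registration
# materials) is publicly accessible via the OSF REST API v2. The project id
# "abc12" is hypothetical; attribute names are assumptions about the response.
import requests

OSF_API = "https://api.osf.io/v2"

def project_summary(node_id):
    """Return a small summary of an OSF project's public metadata."""
    resp = requests.get(f"{OSF_API}/nodes/{node_id}/")
    resp.raise_for_status()
    attrs = resp.json()["data"]["attributes"]
    return {
        "title": attrs.get("title"),
        "public": attrs.get("public"),
        "date_created": attrs.get("date_created"),
    }

if __name__ == "__main__":
    print(project_summary("abc12"))  # hypothetical project id
```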
IV. A REGIONAL FOCUS: LATIN AMERICA
Much of the discussion that has occupied the discipline over the past two decades, beginning with King's 1995 agenda-setting article on the issue of transparency practices in empirical research, is foreign to many scholars working outside the United States and Europe. Encouraging the adoption of these practices in other parts of the world would be consistent with building a global academic community and fostering open science, particularly among those who work with quantitative analysis.
The discipline of Political Science has developed slowly in Latin America, having built strong professional associations only over the past decade. Additionally, a good number of Latin American scholars who receive doctoral training at American universities stay in the United States upon graduation, with the notable exception of Brazilians (Malamud and Freidenberg 2011) – probably a multicausal phenomenon in which job opportunities in the home country play a part. Latin American political scientists compete with other disciplines for journal space in regionally based publications, as these are traditionally interdisciplinary (Narváez-Berthelemot and Russell 2001). Among all Latin American-based journals listed in the Journal Citation Report or Scimago indexes, only two - Revista de Ciencia Política and Política y Gobierno - are exclusively focused on political science (Basabe-Serrano and Huertas-Hernández 2018).
There has been limited progress regarding the availability of replication files in the region. We reviewed all the journals based in Latin American countries that are listed in the Scimago index under the categories of “Political Science and International Relations” and “Sociology and Political Science.” Of a total of 55 unique records, we found that only six academic journals require replication files, while all but four of the 55 offer open access to the public. These figures offer a clear view of the diverging trajectories that replication and open access follow in the discipline at the regional level.
Among the 12 journals specialized in Latin American topics that are not based in Latin American countries,15 we found that three explicitly require the submission of replication files and seven offer open access to users. A plausible explanation for the limited number of publishing venues that require replication files may be the strong tradition of non-quantitative methods in Latin America. Hence, it is possible that authors in the quantitative tradition decide to send their work for publication in journals based in other regions – especially those that publish in languages other than Spanish or Portuguese.16 Indeed, in the 22 journals published in Spanish or Portuguese that are listed in the JCR or Scimago indices from 2011 to 2018, only 33.7% of the studies are either comparative or large-N studies, while all the rest are case studies (Basabe-Serrano and Huertas-Hernández 2018).
With regard to the provision of open access to research articles, it is notable that the subset of journals analyzed overwhelmingly favors posting articles online, free of charge.17 This undoubtedly reflects the journals’ business models, but also a growing consensus about making findings available. For example, the Latin American Council for the Social Sciences (CLACSO) offers a free-access digital repository of books and works in the social sciences.18 FLACSO provides an open repository of students' theses and researchers' books.
The preference for open access at the regional level is crystallized in the Scientific Electronic Library Online (SciELO) Project (www.scielo.org), an index of academic journals run as a program of the Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP) dating back to 1999. Most of the Latin American journals that are part of the Web of Science and Scopus indexes belong to SciELO and are peer-reviewed, open access, digital, and free of charge. SciELO works through national SciELO collections that are financed by national research institutions and follow the same standards (Packer et al. 2014). Eleven Latin American countries are already full participants in the program, and two more are in the process of joining.19 The overall goal is to increase the quality and visibility of research produced in Latin America and the Caribbean.
The risks and opportunities of implementing research transparency and openness policies in publishing are being discussed in the non-American political science community as well. In Canada, where incentives have already been in place given the nature of state-funded research,20 the issue is not so much whether to make replication materials available per se, but rather the details involved in the process (Johnson et al. 2017). These authors point to eight concerns, among them that a uniform standard of data access and research transparency may have problems accommodating different epistemological and methodological traditions, and the high costs associated with storing and translating research documents produced in languages other than English.
Making an analogous effort to anticipate potential issues in implementing open science policies in Latin America, we believe that open access is already a point of consensus. Given the scarcity of specialized political science journals and the relatively small share of studies employing experimental and observational data, the region may be fertile terrain for adopting best practices from the outset. What is more, in our view, incorporating transparent research practices into the classroom would contribute to educating a new generation of Latin American researchers who are likely to consolidate these practices in the profession, as students become researchers, journal editors, and professors.21
V. CONCLUDING REMARKS
The ability of the research community to keep tabs on the origins and use of data in scientific research lags behind the rate at which technologies for generating large amounts of data - and papers analyzing these data - are produced. This situation presents both a challenge and an opportunity, especially in regions of the world where the extended use of quantitative analysis in research is still developing. Can the discipline come up with procedures for ensuring the transparency and reproducibility of analyses that are able to accommodate different research traditions? We agree with Dunning and Rosenblatt (2016) that advancing the transparency agenda is feasible for multi-method and qualitative research, but recognize that it is not possible to have a single set of standards to achieve it. Above all, we should recognize the importance of promoting diversity in our profession regarding theoretical debates, expertise, and methods, while upholding a shared interest in posing interesting and motivating questions, making compelling arguments, and presenting relevant evidence (Yashar 2016).
In the fragmentary data presented above, we identified certain trends in the practices of the profession. Pre-registration - as a means of minimizing the possibility of data manipulation and publication bias - has been growing over the years but is still nowhere near a generalized practice. Making datasets and code available is a much more widespread practice, done voluntarily by authors with the encouragement of an ever-increasing group of journal editors who adhere to these practices. Granting badges for pre-registration, data sharing, and open access has yet to take hold in the discipline, with only two journals currently implementing it. Paradoxically, replication files are increasingly available online and free of charge, while access to the actual research articles often requires a paid subscription.
Much work remains to be done regarding incentives to publish replications of quantitative work. The evidence indicates that a change in the way we do things is taking place, even if only in the time we now spend discussing these issues. Many of the current controversies in the discipline revolve around coming up with acceptable journal policies to achieve transparency for all research traditions, under the presumption that norms will shape behaviors. However, we are observing unenforced changes in patterns of behavior that may ultimately be reflected in rules. Most likely, it will turn out to be an endogenous process, in which rules affect behaviors and vice versa.
An opportunity arises when the dissemination of data is accompanied by the introduction of publication protocols and incentives that promote good practices in the use of these data, many of which we have referred to in this paper. We dare to suggest two avenues that we think are best suited to achieving these goals: teaching and publishing. Teaching new scholars good practices in empirical research will get them accustomed to working in this manner from the beginning of their careers. Incentives to publish high quality research – be it replications or original articles with or without null results – would help ensure that we are learning from our failures as well as our successes and that we remain in an equilibrium that keeps us all honest.
With regard to the development of open science in Latin America, we have shown that the practice of open (free of charge) access to academic articles is predominant. The fact that political science, as a discipline, is still at a developmental stage in the region offers ample opportunity to adopt transparency best practices in empirical research early on. We also encourage wider integration of different parts of the world into the ongoing discussions, given that part of open science is, after all, to share globally.