Replication Research is a peer-reviewed journal for replications and reproductions from various disciplines. Articles need to disclose which original study and which hypothesis or claim they replicate or reproduce. For an up-to-date overview of the disciplines from which we accept submissions, please see the editorial board. Generally, we can provide adequate quality assurance and peer review for cognitive, behavioral, and social science studies (psychology, sociology, education sciences, political science, management, medicine, life sciences).
Please note that the entire review process is open and that reviews will be permanently linked with submitted manuscripts via their DOIs and Pubpeer.com, regardless of whether the article is accepted or rejected. This ensures that quality assurance is transparent and that the work reviewers provide for Replication Research is credited rather than discarded.
Authors seeking guidance regarding reproduction and replication studies can send a pre-submission inquiry to contact@replicationresearch.org or consult FORRT's handbook for reproduction and replication studies (https://forrt.org/replication_handbook). For submissions, authors can optionally use our cover letter template.
Articles submitted to Replication Research need to address a research question that has previously been investigated in a published study, or discuss replication methods that are relevant for at least two different research areas. Empirical reports can be computational reproductions using the same data and code, robustness checks using the same data but different procedures, close replications using new data and the same method, or conceptual replications using new data and a different method. The closeness of the replication needs to be described in detail.
Badges
For articles meeting the requirements listed in the Author Guidelines, we assign badges (images provided by OSF and CODECHECK).
For reproductions (i.e., studies where no new data is collected), we recommend using the Institute for Replication’s template available via this OSF Project or FORRT’s Standardized Reproduction Template (StaRT, long or short form). We do not accept reproductions whose authorship overlaps with that of the original article. Reproductions should expand the original analyses in some way (e.g., through generalizability or robustness checks); mere numerical reproductions (e.g., simply rerunning the code) are not accepted. Authors can propose results-blind peer review for reproduction studies via an informal request in the submission’s cover letter.
Replication studies test a previously published claim or hypothesis using data different from the original study's. They can be internal (i.e., conducted by the same group of researchers), close, or conceptual (for a typology, see Hüffmeier et al., 2016). Authors can use their own format or the Standardized Replication Template (StaRT) provided by Replication Research, which is available online at https://osf.io/3jgxd. Replication articles should provide both a subjective author assessment and a guideline-based assessment of replication success (for guidance, see our Handbook). If possible, authors should also include a reproduction of the original study; we generally recommend reproduction before replication.
Upon acceptance, we expect authors to enter their replication study into the FORRT Replication Database (if it is not already included; see the submission form).
There are many paths from raw data to results. Approaches that systematically vary choices regarding analysis, variable, and sample selection in order to explore most or all of these paths, and to assess how robust a finding is to these analytic decisions, are called multiverse or specification-curve analyses. Approaches that involve many people independently choosing their preferred path of data analysis, whose results are then compared, are called many-analyst studies. Both provide information about robustness and generalizability and are thus an integral part of repetitive research; a minimal sketch of the former is given below.
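To illustrate the idea, here is a minimal specification-curve sketch in Python. It is illustrative only: the data frame, the column names (outcome, treatment, age, gender), and the analytic choices are hypothetical, and it assumes pandas and statsmodels are available.

```python
# Minimal specification-curve sketch (illustrative only): fit the same
# hypothesis test across several defensible analytic choices and compare
# the resulting estimates. Column names and choices are hypothetical.
import itertools
import pandas as pd
import statsmodels.formula.api as smf

def specification_curve(df: pd.DataFrame) -> pd.DataFrame:
    covariate_sets = ["", " + age", " + age + gender"]  # model choices
    outlier_cutoffs = [None, 3.0]                       # sample choices
    rows = []
    for covs, cutoff in itertools.product(covariate_sets, outlier_cutoffs):
        data = df
        if cutoff is not None:  # drop outcome outliers beyond z = cutoff
            z = (df["outcome"] - df["outcome"].mean()) / df["outcome"].std()
            data = df[z.abs() < cutoff]
        fit = smf.ols("outcome ~ treatment" + covs, data=data).fit()
        rows.append({"covariates": covs.strip(" +") or "none",
                     "outlier_cutoff": cutoff,
                     "estimate": fit.params["treatment"],
                     "p_value": fit.pvalues["treatment"]})
    # One row per specification; the spread of estimates indicates robustness.
    return pd.DataFrame(rows).sort_values("estimate")
```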
For multi-study articles, a mix of replications and reproductions is possible. Authors need to disclose for each study whether it is a replication or a reproduction. A mix of original studies and replications is not possible.
Replication Research accepts registered reports for studies where new data is collected. More information on registered reports is available in the guide to registered reports.
Replication Reviews can be meta-analyses and reviews that focus on replications and reproductions (e.g., to aggregate the state of replicability across a body of literature). For meta-analyses of replications, we recommend including a meta-analysis of the original effects being replicated alongside the meta-analysis of the replications, and possibly a pooled meta-analysis of both original and replication studies; a sketch of this structure follows below. Replications or reproductions of meta-analyses should be submitted to the Replications or Reproductions section.
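As an illustration of this recommended structure, the following minimal sketch pools effect sizes under a random-effects model (DerSimonian-Laird) separately for the original effects, the replication effects, and the combined set. The effect estimates and sampling variances are hypothetical, and the function is a simplified stand-in for a full meta-analysis package.

```python
# Sketch of the recommended structure: random-effects pooling applied to
# original effects, replication effects, and the combined set.
# All input numbers below are hypothetical.
import numpy as np

def random_effects_pool(effects, variances):
    """Return (pooled effect, standard error) via DerSimonian-Laird."""
    y, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v                                  # fixed-effect weights
    q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
    df = len(y) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1.0 / (v + tau2)                    # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    return pooled, np.sqrt(1.0 / np.sum(w_star))

originals = ([0.42, 0.55], [0.02, 0.03])         # hypothetical effects, variances
replications = ([0.10, 0.18, 0.05], [0.01, 0.02, 0.01])
print(random_effects_pool(*originals))           # originals only
print(random_effects_pool(*replications))        # replications only
print(random_effects_pool(originals[0] + replications[0],
                          originals[1] + replications[1]))  # pooled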
We do not accept articles that estimate replicability across a body of original literature statistically, using methods such as z-curve or p-curve, as they do not constitute repetitive research (they would, however, be acceptable as part of a larger project). Instead, the focus is on the accumulation of evidence from replication and reproduction studies.
Theoretical contributions to the methods of all types of replication research should be relevant to multiple research areas. These papers may introduce and validate new methods, or discuss the role and limitations of repetitive research in a specific context (e.g., the quality of reproduction studies in undergraduate theses).
We also welcome practical introductions to methods or tools relevant for conducting repetitive research.
Replication Research welcomes comments on articles published in our pages. If several comments make similar points, the editors will select one for review. Authors will be given the opportunity to submit one aggregated Reply to comments.
All articles and corresponding materials are published under a CC BY 4.0 license. For special cases involving ethical or legal restrictions, there is the option to provide access to the data only to the journal.
These guidelines are shared under a CC BY 4.0 license and have been created by the members of the R2 editorial board. They are loosely based on the submission guidelines of Meta Psychology and Free Neuropathology.
The names and email addresses entered in this journal site will be used exclusively for the stated purposes of this journal and will not be made available for any other purpose or to any other party.
Our extended data protection policy is available here: https://ejournals.uni-muenster.de/index.php/index/dataprotectionpolicy