Abstract

There are formal calls for increased reproducibility and replicability in sports and exercise science, yet there is minimal information on field-wide knowledge of these concepts. We therefore conducted a survey on the attitudes and perceptions of sports and exercise science researchers towards reproducibility and replicability. Descriptive statistics (e.g., proportion of responses) and thematic analysis were used to characterize the responses. Of the 511 respondents, 42% (n = 217) believe there is a significant crisis of reproducibility or replicability in sports and exercise science, while 36% (n = 182) believe there is a slight crisis. A further 3% (n = 15) of respondents believe there is no crisis, while 19% (n = 95) did not know. Four themes were generated in the thematic analysis: the research and publishing culture, educational barriers to research integrity, research responsibility to ensure reproducibility and replicability, and current practices facilitating reproducibility and replicability. Researchers believe that engaging in open science can be detrimental to career opportunities due to a lack of incentives. They also feel journals are a barrier to reproducible and replicable research due to high publication charges and a focus on novelty. Statistical expertise was identified as a key factor for improving reproducibility and replicability in the future, particularly a better understanding of study design and of different statistical techniques. Statistical education should be prioritized for early career researchers, which could positively affect publication and peer review. Researchers must accept responsibility for reproducibility and replicability through thorough project design, appropriate planning of analyses, and transparent reporting practices.

1 Introduction

The concept of replication has recently gained attention in psychology due to failures to replicate published studies (Klein et al., 2014; Open Science Collaboration, 2015). This attention has since expanded to other fields such as social science (Camerer et al., 2018), economics (Camerer et al., 2016), and cancer biology (Errington et al., 2021), where similar large replication projects suggested a crisis of confidence in research findings (Pashler & Wagenmakers, 2012). This replication crisis led to discussions around the replicability, reproducibility¹, and transparency of research practices, which helped inspire the open science movement (Munafo et al., 2017).

The response to the replication crisis has been mixed. Those in favor of replication studies believe they can increase (or decrease) confidence in research findings, update the boundaries of findings (i.e., their external validity; Nosek & Errington, 2020), identify type I errors, and control for sampling error (Schmidt, 2009). However, some argue that concerns regarding replication are overblown, as replicability is not an ideal for all disciplines in science and cannot be universally applied (Guttinger, 2020). Others believe that large replication efforts are misguided and a waste of valuable resources (Eufemia et al., 2018; Stroebe & Strack, 2014).

Due to these contrasting views on the value of replication and reproducibility, a Nature survey explored the opinions of researchers in different fields (Baker, 2016). Of 1,576 researchers, 52% believed there was a significant reproducibility crisis in science and 38% believed there was a slight crisis. A similar survey of psychologists was conducted to understand that community's opinion on the importance of replication (Buttliere & Wicherts, 2018); results showed the community viewed replications as an essential aspect of the research process for determining which effects are real. Although replication is "one of the most obvious ingredients of science" (Schmidt, 2009, p. 91), it is not the norm across all scientific disciplines, causing a period of unrest amongst those who advocate for it.

The issue of replication has yet to be examined in sports and exercise science, despite several publications identifying methodological and statistical concerns and advocating for more replication studies within the discipline (Caldwell et al., 2020; Halperin et al., 2015; Heneghan et al., 2012; Knudson, 2017). Some single-study replication attempts have been published in the field (Chalmers et al., 2018; Morin et al., 2019; Pitsch & Emrich, 2011), and there is an ongoing large replication project (Murphy et al., 2022). Additionally, research groups have been formed to improve the manner in which research is conducted in the field (e.g., STORK, the Society for Transparency, Openness and Replication in Kinesiology). Yet, as replication has not captured the attention of sports and exercise science in the way it has in other fields (e.g., psychology, social science, cancer biology, and economics), there is limited field-wide discussion of the concept. Consequently, there is to date no understanding of the attitudes towards, and perception of, reproducibility and replication in sports and exercise science. It is therefore difficult to gauge how accepting sports and exercise science researchers are of reproducibility and replicability at a field-wide level and, if they are opposed, the reasons for their reluctance to embrace changes.

Publishers and journals are accused of prioritizing novel findings over replication studies to increase impact and journal metrics (Chambers et al., 2014; Nosek et al., 2012), which detracts from replication efforts. Replication is also considered by some to be an inferior and less creative mode of research (Makel & Plucker, 2014). Other researchers are opposed to replication because they feel it is a personal attack on their work and a hostile action (Nosek et al., 2022, p. 20). Thus, it is crucial to understand the barriers to the open science movement, particularly replication, for this field, as this movement is affecting all areas of social science. By identifying the barriers to undertaking replication, changes can be implemented to incentivize researchers to adapt their methods and improve research practices. This information is essential to facilitate an increased number of replication studies, build awareness of current practices, and increase collaboration and transparency amongst researchers and statisticians alike (Caldwell et al., 2020; Sainani et al., 2020).

The purpose of this survey is to explore the attitudes and perceptions of researchers towards reproducibility and replicability in the field of sports and exercise science, by adapting the established Nature survey (Baker, 2016). The objectives of this study are to understand the community's awareness of the terms reproducibility and replicability and its attitudes towards these concepts, and to identify potential barriers to reproducibility and replicability in sports and exercise science.

2 Methods

2.1 Recruitment Strategy

To be included in this study, participants had to be active researchers; the sample was therefore limited to researchers who had published in a sports and exercise science journal in the five years preceding survey distribution (2016–2021). As per the preregistration, we aimed for a final sample size close to 2,000, based on surveys of similar topics (Baker, 2016; Buttliere & Wicherts, 2018; Ross-Hellauer et al., 2017). All participants were informed through the survey website that participation was anonymous and voluntary, and that the study results and underlying data would be published. Participants provided consent using a digital informed consent form completed prior to beginning the survey.

2.2 Participants

There were 511 responses to the survey, representing a response rate of 2.7%. Regarding demographics, 38% of respondents were from North America, 37% from Europe, 12% from Australasia, 6% from Asia, 5% from South America, and 2% from Africa. In terms of age, 31% of respondents were 25–34 years old, 36% were 35–44, and 18% were 45–54. Most respondents selected Associate Professor as their main job role (27%), followed by Professor (21%), Post-doctoral Fellow (10%), and PhD student (8%).

2.3 Preregistration Deviation

We originally planned to contact 10,000 sports and exercise science researchers via a mailing list of corresponding authors who had published in sports and exercise science journals according to the Web of Science research database (www.webofknowledge.com). However, due to very low response rates, we deviated from the preregistration and contacted a total of 23,690 researchers. Invitation emails were sent between May and July 2021, of which 18,854 were delivered; the 511 completed responses therefore represent a response rate of 2.7% (511/18,854). The undelivered emails (n = 4,836) were due to researchers moving institutions, university spam filters, and other unknown reasons. We hypothesize that the low response rate could be a result of the survey length (mean completion time = 68 min 21 s), the time of distribution (summer/university holidays), and the absence of a follow-up reminder.

2.4 Experimental Design

The survey was adapted from a previously published Nature survey which explored scientists' opinions on reproducibility in their own and other fields (Baker, 2016). Minor adaptations included the addition of questions relating to replication alongside those already focused on reproducibility. Questions were also adapted to be specific to sports and exercise science (e.g., "In the field of sports and exercise science…"). The survey included 20 short sections and 45 questions focused on: familiarity with the terminology; perception of the reproducibility/replication crisis; the proportion of published results that are reproducible or replicable; funder and publisher efforts to improve reproducibility and replicability; established procedures for reproducibility and replicability, and the impact of these on the laboratory; barriers to reproducibility and replicability; contributory factors to a failure to reproduce or replicate results; and factors that would improve reproducibility and replicability. The following definitions were provided in the survey: reproducibility is retesting a claim using the same analyses and the same data, whereas replicability is retesting a claim using the same analyses and new data (Nosek & Errington, 2020).

Both multiple-choice answers and free-text boxes were used in the survey. We included open text boxes to capture opinions on reproducibility and replication that multiple-choice questions might miss. Question skip logic was applied so that participants did not have to respond to a question when their answer to the previous question made it irrelevant. The survey is available in full online, along with the data, R code, and supplementary materials (https://doi.org/10.17605/OSF.IO/64R8M). The preregistration is also available online (https://doi.org/10.17605/OSF.IO/EXK6N). Ethical approval was granted by Technological University Dublin (REC-PGRI-202021).

2.5 Quantitative Data Management and Statistical Analysis

The final analysis included survey responses that were fully completed and for which digital consent was received. Data were collected via encrypted, password-protected online survey software, Microsoft Forms (version 16.63.1; Microsoft, Redmond, WA, USA). There were 10 sections with free-text data, which consisted of brief sentences in response to the open-ended questions. These responses were transferred to a Microsoft Excel spreadsheet (version 16.63.1; Microsoft, Redmond, WA, USA). Descriptive statistics (e.g., proportion of responses) were computed for the categorical data using R (version 4.2.1; R Core Team, 2022).
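As an illustration only, the descriptive analysis reduces to counts and percentages per answer option. The sketch below is not the authors' published code (which is available on the OSF page); the file name and the column name `crisis` are hypothetical.

```r
# Minimal sketch of the descriptive statistics: proportion of responses
# per answer option. File name and column name `crisis` are hypothetical.
responses <- read.csv("survey_responses.csv")

tab <- table(responses$crisis)     # counts per answer option
round(100 * prop.table(tab), 1)    # percentages, e.g., 42.5% "significant crisis"
```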

2.6 Thematic Analysis Approach

The research question for this study was addressed using a reflexive thematic analysis approach. This approach involves "the researcher's reflective and thoughtful engagement with their data and their reflexive and thoughtful engagement with the analytic process" (Braun & Clarke, 2019, p. 594). As we analysed the data with our aim in mind, the themes are strongly related to the research question and were driven by the researchers' theoretical interest. This is indicative of a deductive analysis; however, inductive analysis was also employed to ensure full interpretation of the data content. Using this type of analysis, responses were open coded to best represent meaning from the participants, and a pre-specified coding book was not used.

Semantic coding was initially used to identify themes through engagement with the surface meaning of the data; key words and phrases were highlighted on hard copies of the transcripts. However, our coding approach was not exclusively semantic, as we also interpreted the meaning underpinning participants' responses in subsequent readings of the data (i.e., latent coding) (Braun & Clarke, 2019). The codes and their corresponding data extracts were then organised into "theme piles" (Braun & Clarke, 2006) and subsequently revised and developed. Where codes were organised around recurring patterns, sub-themes were formed. These sub-themes were then linked to one another and grouped to form a major theme. Our last step was to collate the data extracts in tables with their corresponding sub-themes and themes. The data extracts presented in the results were selected for clarity of theme representation, but the dataset is fully available on the OSF project page.

3 Results

Of the 511 respondents, 47% (n = 239) were very familiar and 39% (n = 200) were fairly familiar with the term reproducibility, while 30% (n = 152) were very familiar and 35% (n = 181) were fairly familiar with the term replicability. Over three-quarters (78.1%) of respondents believe there is a replication and reproducibility crisis in sports and exercise science (Figure 1).

Figure 1. Descriptive results of the response to the survey question about a reproducibility crisis or replication crisis in sports and exercise science.

When responding to a question asking whether they had encountered barriers to implementing changes that would improve reproducibility and replicability in the laboratory, 37% of respondents (n = 189) identified barriers, 42% (n = 217) did not, and 20% (n = 102) were unsure. Furthermore, when answering a question on the factors that contribute to a study failing to replicate, respondents believed poor experimental design, insufficient mentoring, publishing pressure, and selective reporting were among the highest contributing factors (Figure 2).

Figure 2. Descriptive results of the response to the survey question on factors contributing to a failure to replicate. The number to the left of each bar indicates the percentage of participants who responded always contributes, very often contributes, or sometimes contributes, while the number to the right indicates the percentage who responded rarely contributes or never contributes. The center of the bar (grey) indicates those who responded I don't know. Statements are ordered according to the total percentage of agreement.
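Figure 2 is a horizontal stacked percentage bar chart. As an illustration only, a chart of this kind could be drawn as sketched below; this is not the authors' published plotting code (which is available on the OSF page), and the statements and percentages are hypothetical placeholders.

```r
# Minimal sketch of a Figure 2-style stacked percentage bar chart.
# Statements and percentages are hypothetical, not the survey results.
library(ggplot2)

likert <- data.frame(
  statement = rep(c("Poor experimental design", "Publishing pressure",
                    "Selective reporting"), each = 3),
  response  = rep(c("Contributes", "Don't know", "Rarely/never"), times = 3),
  pct       = c(90, 4, 6, 85, 7, 8, 88, 5, 7)
)
# Order statements by total percentage of agreement, as in Figure 2
agree <- subset(likert, response == "Contributes")
likert$statement <- factor(likert$statement,
                           levels = agree$statement[order(agree$pct)])
likert$response  <- factor(likert$response,
                           levels = c("Contributes", "Don't know", "Rarely/never"))

ggplot(likert, aes(x = pct, y = statement, fill = response)) +
  geom_col(position = position_stack(reverse = TRUE)) +  # agreement on the left
  scale_fill_manual(values = c("#1b9e77", "grey75", "#d95f02")) +
  labs(x = "Percentage of respondents", y = NULL, fill = NULL) +
  theme_minimal()
```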

3.1 Thematic Analysis Results

Four key themes were generated from the data through the thematic analysis (Supplementary Tables 1-4): the research and publishing culture, educational barriers to research integrity, research responsibility to ensure reproducibility and replicability, and current practices facilitating reproducibility and replicability in the field. A summary of the results is presented below, and the tables include selected quotes and information taken directly from the respondents for clarity.

3.1.1 Key Theme 1: The Research and Publishing Culture

Under the main theme of the research and publishing culture (Supplementary Table 1), three recurring sub-themes were identified as barriers to replication: incentives for undertaking replication research, the priority of novel research, and the business model of publishing. Survey respondents believe that engaging in open science, or conducting replication studies, will be detrimental to career progression due to a lack of incentives. Sports and exercise science researchers feel pressured to produce a high quantity of research studies due to the high level of competition for career and funding opportunities. Furthermore, according to respondents, novel research is prioritized over studies that are methodologically sound, and this is exacerbated by journal bias.

Journals were described as a barrier to reproducible research that actively promotes the file-drawer problem, as they often reject research which is not considered novel or reports non-significant findings. Researchers also expect to be criticized for publishing replication studies and feel there is no value placed on them, especially in higher-ranked or prestigious journals (i.e., quartile 1 journals). Additionally, researchers feel that journals are a barrier to reproducible research because "scientific publishing is a billion-dollar business now". Lastly, they believe publishers are often profit focused, and publication fees further exacerbate the file-drawer problem, as unfunded researchers will simply not publish.

3.1.2 Key Theme 2: Educational Barriers to Research Integrity

Under the main theme of educational barriers to research integrity (Supplementary Table 2), there were two recurring sub-themes: the quality of peer review, and the statistical expertise and knowledge of researchers. There were mixed views on the role of peer review in upholding the values of research integrity, yet there was agreement on the importance of statistical knowledge for peer reviewers. Respondents identified that greater scrutiny of study design is needed from peer reviewers. However, the lack of a formalized education process or screening for peer reviewers leaves some reviewers unable to identify poor analyses or a lack of controls, or to recognize bias. Statistical expertise was a clear recurring theme throughout many responses, specifically researchers' statistical education. Many researchers feel that a better understanding of study design, and of the different statistical techniques used to analyse data, would improve reproducibility and replicability within the field. Errors in data management and in the application of statistical techniques were discussed as common factors affecting reproducibility and replicability in the field.

3.1.3 Key Theme 3: Research Responsibility to Ensure Reproducibility and Replicability

Under the main theme of research responsibility to ensure reproducibility and replicability (Supplementary Table 3), there were three recurring sub-themes: journal responsibility, researcher responsibility, and senior researcher/supervisor responsibility. Where the responsibility for ensuring reproducibility and replicability lies in the research process was heavily debated in the responses.

Some believe journals are responsible for promoting transparency: there should be basic criteria for sample size justification, reporting, and analysis, and flexibility with journal article length would be helpful. As publishing moves into a more digital era, researchers appear frustrated with the lack of a corresponding increase in page limits, which they argue would decrease the selective reporting of results. Journals can facilitate and encourage open science practices via author guidelines and the types of publications they request (e.g., replication studies), and they can enforce reporting criteria for readers and authors. Essentially, journals have an opportunity to be leaders in implementing policies; they should be fostering changes rather than just policing. On the other hand, some researchers feel that journals have too much research power and should have a smaller role rather than acting as gatekeepers in science.

Other respondents believe that the responsibility for ensuring reproducibility and replicability should lie with researchers. Publication is the last stage of the research process, so it is the researcher's responsibility to maximize transparency in their reporting practices and to design their studies appropriately. Finally, supervisors were specifically identified as having a responsibility to promote open science practices for reproducible and replicable research among early career researchers and students. According to respondents, the promotion of these practices by supervisors appears to determine the engagement of other researchers within the laboratory or research group.

Researchers in the field also believe that individuals overestimate their level of statistical expertise; some theorize that this applies to both early career researchers and supervisors. Supervisors also have an important role as mentors and should educate themselves, and their students, on the importance of reproducibility and replicability. Respondents believe more collaboration with statisticians and data analysts would help improve their own knowledge and compensate for any shortfalls that could affect research transparency and quality.

3.1.4 Key Theme 4: Current Practices Facilitating Reproducibility and Replicability in the Field

Under the main theme of current practices facilitating reproducibility and replicability in the field (Supplementary Table 4), there were two recurring sub-themes: data sharing and checklist usage. There appear to be mixed views on open data and data sharing among researchers in the field; journals are encouraging data sharing, which is deemed positive, but there is little enforcement or standardization of this. Many respondents have concerns about data sharing. First, there are potential career disadvantages to forcing all data and code to be shared; for example, some authors fear being scooped. Second, preparing open datasets is time-consuming for authors because the data must be organised in a readable format.

Finally, respondents believe it is difficult to ascertain whether data badges and data sharing are having a positive effect and are therefore unsure whether they are worthwhile. There is also a general sense of frustration with the use of checklists when submitting research for publication. Respondents feel checklists are currently too generic, applied inconsistently and without rationale, and frequently ignored during the peer review process. Some researchers feel they should be compulsory, with studies not published unless the checklists are followed appropriately; in contrast, many respondents declared they should be banned altogether.

Lastly, although not specifically linked to the themes identified in the thematic analysis, there were multiple comments regarding attitudes towards open science as a movement (Supplementary Table 5). Some respondents believe a few open science advocates are actively trying to discredit other researchers' work or are specifically targeting research groups. Others reported that the negative perception around failed replication studies discourages them from attempting replication.

4 Discussion

The overall aim of this study was to determine the attitudes towards, and perception of, reproducibility and replicability among sports and exercise science researchers. Survey results showed that over three-quarters of respondents believe there is a crisis of reproducibility and replicability in the field, with 42.5% believing this crisis is significant. These concerns are lower than those reported by Baker (2016), where 90% of researchers across different scientific disciplines acknowledged the existence of a reproducibility crisis. We suspect the lower rate of concern reflects the minimal discourse on replication in sports and exercise science; a potentially naive belief that science in the field is functioning well, despite concerns identified by some researchers, could also have contributed. Four key themes were generated in the thematic analysis: the research and publishing culture, educational barriers to research integrity, research responsibility to ensure reproducibility and replicability, and current practices facilitating reproducibility and replicability, which we have interpreted and grouped in the results. The remainder of this section discusses the context and implications of these thematic areas, as well as suggestions for future practices.

As identified in the theme of the research and publishing culture, researchers feel that sport and exercise science is currently under siege from competition, commercialization, and metrics. These forces create a research culture largely driven by career incentives and novel research (Chambers et al., 2014; Nosek et al., 2012; Smaldino & McElreath, 2016). The pressure to publish is exacerbated by competition within academia: universities worldwide produce more PhDs than there are permanent academic positions (Powell, 2015). Publications influence hiring, promotion, and grant decisions and are considered a marker of achievement (Fanelli et al., 2017); consequently, the publication process is negatively perceived by some researchers due to overwhelming academic pressure (de Vrieze, 2021).

Academic pressures are similarly apparent in sports and exercise science, as pressure to publish was identified as one of the highest contributing factors towards a failure to replicate or reproduce findings in our survey (Figure 2). This is unsurprising given that survey respondents feel pressure to produce a large quantity of research output, potentially without regard for its quality or transparency, to keep up with their peers. Furthermore, 62.8% of clinical cancer researchers admitted publishing pressure influences their reporting, while 23% believed selective reporting or manipulating data was necessary to prove a hypothesis (Boulbes et al., 2018). One could argue that our field suffers from the same pressures, and that we may have a crisis of incentives on our hands.

Sports and exercise science researchers reported that they are disincentivized from undertaking replication studies due to the priority given to novel research and the belief that replications lack creativity ("we know that reviewers are seeking novelty in the work, and I would expect to be criticized if I submitted a replication study"). This finding is similar to other fields (Nosek et al., 2012); these researchers are therefore as much victims as they are facilitators of poor scientific behaviors. They are incentivized to engage in poor, or potentially dishonest, practices (John et al., 2012) simply because of the trade-off between quantity and quality in sports and exercise science, of which quantity is winning (Allen & Mehler, 2019) ("I believe that academia pushes for greater scientific output at the cost of its quality"). There needs to be a change in culture for individuals to change their practices.

A healthy research culture, which rewards quality rather than publication volume, would improve replicability and reproducibility within the field. These are not simple changes; they require structural change at the cultural, university, and publishing levels. Achievable changes can, however, be made in the short term to set the foundations for improved cultural practices: organizing a journal club to discuss open science practices, preregistering studies, adopting preprints, and using a dedicated, transparent project workflow system. The adoption of open science can be overwhelming because it has many different facets, but Kathawalla et al. (2021) created a helpful guide to assist students and advisors with their journeys into open science. The currently accepted norms of pressure to publish will continue until the incentive structure within the field changes.

For researchers, there is a temptation to produce and prioritize novel work for career success (Chambers et al., 2014). Novel or impressive findings are a primary goal of the current academic culture (Bernards et al., 2017), as evidenced by the 2,500% increase in the frequency of words such as "innovative", "novel", and "ground-breaking" in abstracts of PubMed articles from 1974 to 2014 (Vinkers et al., 2015). The demand for novel research is also apparent in our field, and it instills a need for researchers to produce statistically significant findings. According to our survey respondents, selective reporting of novel or positive results was one of the highest contributing factors towards a failure to reproduce or replicate studies (Figure 2). This is supported by the implausibly high positive result rate of 81% across 300 studies in three flagship sports and exercise science journals (Twomey et al., 2021). Similarly, a positive result rate of 82% was reported for four high-impact sports medicine and physiotherapy journals (Büttner et al., 2020). Many clinical cancer researchers (47%) also felt pressured to produce a positive result by a collaborator (Boulbes et al., 2018), and based on our survey responses, this proportion could be higher in our field.

Non-significant or less exciting results are often shunned by journals due to their lower citation rates (Fanelli et al., 2017). A consequence is that sports and exercise science researchers are possibly disinclined to submit these types of results for publication, and they are relegated to the file drawer (Rosenthal, 1979). Significant, novel findings are thereby deemed worthy of publication, while null or less exciting results are never seen by the scientific community ("I have had papers rejected on the basis that the results weren't 'positive or significant'. We all have. Journals perpetuate the problem by prioritizing novel findings."). Publication bias can alarmingly distort the proportion of true effects in the body of literature, rendering many study findings non-replicable.

The crucial step of verification or replication is rarely taken in sports and exercise science, while journals breed poor scientific behaviors (Chambers et al., 2014). However, changes are underway to prevent the selective reporting of results, as Registered Reports are now offered as a publishing format (Chambers et al., 2014). Registered Reports undergo two rounds of peer review, before and after data collection, so that a manuscript can receive in-principle acceptance before any results are obtained. Although this format is offered by many journals (see cos.io/rr), it is only beginning to be offered by sports and exercise science journals (Abt et al., 2021; Impellizzeri et al., 2019). Sport and exercise science must make a collective effort, where possible, to support journals that promote open practices and guidelines, rather than those focused on profit or on their impact factor, a controversial metric (Heathers, 2022). This may be easier for those with more career security (e.g., tenured researchers), and leadership from these more senior researchers on this issue would greatly improve the adoption of better publishing practices.

Statistical education was a key recurring theme throughout the thematic analysis and is supported by the quantitative results, as respondents selected poor experimental design, insufficient mentoring, low statistical power, and mistakes as contributing factors towards a failure to replicate. Statistical and methodological errors are prevalent in sports and exercise science (Borg, Lohse, et al., 2020; Knudson, 2017; Nielsen et al., 2017), and the use of controversial statistical methods has even resulted in mainstream media criticism (Aschwanden & Nguyen, 2018; Sainani et al., 2019). Consequently, some researchers advocate for increased collaboration with statisticians within the field, and we echo those calls (Sainani et al., 2020; Sainani & Chamari, 2022). This recommendation requires a shift in cultural norms, but perhaps larger structural changes are required for the long-term health of the sports and exercise science academic system.

A redirection of attention to the impact of open science practices on students could be instrumental for the future of our field (Pownall et al., 2022). The introduction of preregistration was perceived as a helpful planning tool in the education of undergraduate psychology students and could promote best research practices, thereby reducing questionable research practices (Blincoe & Buchert, 2019). Similarly, replication studies could be encouraged as part of student projects; examples include the Hagen Cumulative Science Project (Jekel et al., 2019) and the Collaborative Replications and Education Project (Wagge et al., 2019).

When replication studies are integrated into academic training, students report an increased understanding of the research process and increased confidence with statistical methods, and find the overall experience positive (Smith et al., 2021; Stojmenovska et al., 2019). The incorporation of reproducible and replicable practices by early career researchers could improve the outlook of sports and exercise science by positively influencing the accuracy of reporting, which respondents identified as problematic for research quality ("I think many times researchers believe that they know more about research than they do, making serious errors in methodology, using the wrong statistical tests, or not having clear objectives that they know how to accomplish."). Prioritizing statistical education may also have a positive impact on peer review when early career researchers eventually assume that role. The sports and exercise science field would therefore reap the rewards of an investment in better statistical education in the future.

There were mixed views on the responsibility of sports and exercise science journals for ensuring reproducibility and replicability. Some respondents believe journals should promote reproducibility and replicability ("Journals can certainly facilitate good open science practices among academics"), while others believe researchers are responsible ("I think the researchers should own and drive it"). Reporting guidelines and checklists were introduced by journals over a decade ago (Atkinson et al., 2008), yet they do not appear to be used frequently (Twomey et al., 2021), even though their use has been shown to increase the quality of reporting in medical journals (Turner et al., 2012). The Transparency and Openness Promotion (TOP) guidelines were created by the Center for Open Science to enhance journal transparency (Nosek et al., 2015). The mean TOP factor (https://osf.io/t2yu5/) for 38 sports and exercise science journals was 2.05 ± 1.99 out of 27 for engagement with openness and transparency (Hansford et al., 2022). This low score demonstrates an opportunity for these journals to review their open science policies and implement changes that increase transparency and move the field forward. There was a clear consensus in the responses that journals act almost as sole gatekeepers in science: they hold a large proportion of research responsibility yet frequently reject replication studies ("The journals wield a double-edged sword when it comes to replication and reproducibility").

We, as sports and exercise science researchers, need to assume responsibility for our study designs rather than expecting improvements to be suggested during the peer review process. Peer review is not designed to verify findings; that expectation is too much for a voluntary role (Mellor, 2021). Even if it were, verification is only possible when the data and code are shared. As this is not the norm in sports and exercise science (Borg, Lohse, et al., 2020), peer reviewers are limited to reviewing claims based on the limited information provided in the manuscript. We suggest spending more time and attention on study design (Mesquida et al., 2022; Scheel et al., 2020), undertaking preregistration and specifying hypotheses², and collaborating with statisticians to improve our statistical inferences (Sainani et al., 2020). Essentially, we must assume responsibility for reproducibility and replicability ourselves, rather than offsetting that responsibility elsewhere (e.g., onto peer reviewers).

Like reporting guidelines and checklists, data sharing guidelines are present in many sports and exercise science journals. Although data sharing would facilitate reproducibility and replicability, the guidelines are not often enforced according to survey respondents ("The implementation of and adherence to checklists and standards is very haphazard"). Of 300 sports and exercise science articles, only 2.33% had a data accessibility statement and only 0.67% reported open data or code (Twomey et al., 2021). In a similar analysis of 299 sports and exercise science studies, only 4.3% shared data and 1.7% stated data was available on request³, while no study shared any code or syntax related to the statistical analysis (Borg, Bon, et al., 2020). There is some reluctance to share data due to concerns about scooping, where another author or research group obtains the data and publishes first ("…having to give away work that they would otherwise be able to leverage to get a head start on future publications to bigger (and hence faster-moving) groups is a real problem"). This concern is shared by researchers in other fields, who view open data access as beneficial for the development of the scientific system of knowledge but not for an individual researcher and their prospective career (Ostaszewski, 2014).

Researchers are also fearful that open data might be misused or misinterpreted (Ostaszewski, 2014). Yet, as data and code availability are essential for future replications and meta-analyses, identifying errors during the scientific process must be normalized and communicated in a respectful but factual manner. We, as researchers, make mistakes (Nuijten et al., 2015), and a process of long-term self-correction is important for research validity. Furthermore, citation counts are higher for studies with open data (Piwowar & Vision, 2013), and there are initiatives to encourage data sharing, such as open data badges and the Peer Reviewers' Openness Initiative (Morey et al., 2016). Although there can be legitimate obstacles to data sharing (e.g., ethical considerations, intellectual property, or data belonging to a longitudinal project), one could release a limited set of variables (excluding those that threaten privacy), embargo the dataset, or share a simulated dataset (Borg, Bon, et al., 2020). Sharing data increases its utility, whereas closed science decreases its usability over time (Vines et al., 2014). When data sharing is not possible, sharing code, instruments, and analysis materials is still valuable for replication and should be encouraged in sports and exercise science.

Finally, there were some comments from survey respondents about the open science movement in general. Some respondents reported a negative perception around failed replications. This indicates an increased need to educate researchers on the meaning of a non-replicable finding: it does not automatically undermine the original study results or mean they are false (Maxwell et al., 2015). There are a number of reasons a replication study may produce results dissimilar to the original study, including unanticipated differences between the studies, low statistical power, or large heterogeneity in effect size estimates (Klein et al., 2018). Perhaps the term "failed" should be removed from replication research altogether, as it implies negativity. Regardless of the replication outcome, there must be respectful communication with the original authors (Janz & Freese, 2020) and consideration of the tone of scientific critique⁴. The open science movement aims to improve the current biased and exclusive academic system (Kent et al., 2022) and must be inclusive of all types of researchers: students, early career researchers, and senior researchers. In other words, a shift away from the current closed research culture and gate-keeping should be a goal of future researchers in this field.

4.1 Limitations

There are several limitations of this survey. First, there was a high level of familiarity with the terms reproducibility and replicability, which indicates that respondents inclined towards open science were more likely to participate (i.e., survey bias). The survey was deliberately not advertised on social media to minimize this as much as possible, but it is highly likely that our respondents shared an interest in this topic. Second, the survey was adapted from Baker (2016), who used the terms reproducibility and replicability interchangeably. For this survey, definitions of reproducibility and replicability were given; however, for question 9⁵, these constructs were ill-defined and used interchangeably. For example, question 9 states that the results of a given study could be replicated exactly or reproduced in multiple similar experimental systems with variations of experimental settings such as materials and experimental model. This could be viewed as misleading for participants, as the answer should reflect the union of two different constructs. Additionally, some of the Likert questions were incorrectly balanced (e.g., in Figure 2 there were more options for negative answers than positive ones). This is a limitation of the original study from which this survey was adapted and was not corrected here. Finally, participants had the option of not answering questions with an open text box; therefore, respondents who held an opinion may have been more inclined to answer (i.e., response bias).

4.2 Conclusion

More than three-quarters of respondents believe there is a reproducibility and replicability crisis in sports and exercise science. In the thematic analysis, respondents indicated that novel research is prioritized over methodologically sound research, and publication quantity over quality. There was a consensus that journals currently have too much research power and that the guidelines and policies they have in place for increasing transparency (reporting checklists and data sharing guidelines) are not sufficiently enforced. Statistical education was also highlighted as a contributing factor towards poor reproducibility and replicability in the field. We recommend that researchers assume increased responsibility for the reproducibility and replicability of their own work by designing studies appropriately, preregistering hypotheses, collaborating with statisticians, and sharing data. We also recommend including open science practices in early career researcher education, including replication studies as a potential replacement for the traditional thesis, as well as keeping an open mind towards other replication attempts. The strategic implementation of small changes will ultimately benefit the reproducibility and replicability of the field, and seeing examples of open science practices should increase uptake, particularly amongst early career researchers, in the long term.

5 Additional Information

5.1 Data Accessibility

The survey data, R code and supplementary materials are available online at https://doi.org/10.17605/OSF.IO/64R8M while the preregistration is also available online at https://doi.org/10.17605/OSF.IO/EXK6N.

5.2 Author Contributions

  • Contributed to conception and design: JM, JPW
  • Contributed to acquisition of data: JM
  • Contributed to analysis and interpretation of data: JM, JPW
  • Drafted and/or revised the article: JM, JPW, CM
  • Approved the submitted version for publication: JM, JPW, CM

5.3 Conflict of Interest

The authors report there are no competing interests to declare.

5.4 Funding

This work was supported by the Irish Research Council’s Government of Ireland Postgraduate Scholarship Programme (GOIPG/2020/1155).

5.5 Acknowledgments

The authors would like to thank Aaron Caldwell for his helpful feedback on the manuscript and the R code for the graphs. We would also like to thank the survey respondents for their time, particularly those who contacted us with feedback about the survey.

5.6 Preprint

The pre-publication version of this manuscript can be found on SportRxiv (DOI: 10.51224/SRXIV.234).

6 References

Abt, G., Boreham, C., Davison, G., Jackson, R., Wallace, E., & Williams, A. M. (2021). Registered Reports in the Journal of Sports Sciences. Journal of Sports Sciences, 39(16), 1789–1790. https://doi.org/10.1080/02640414.2021.1950974
Allen, C., & Mehler, D. M. A. (2019). Open science challenges, benefits and tips in early career and beyond. PLOS Biology, 17(5), e3000246. https://doi.org/10.1371/journal.pbio.3000246
Aschwanden, C., & Nguyen, M. (2018). How shoddy statistics found a home in sports research. FiveThirtyEight. https://fivethirtyeight.com/features/how-shoddy-statistics-found-a-home-in-sports-research/
Atkinson, G., Batterham, A., & Drust, B. (2008). Is it Time for Sports Performance Researchers to Adopt a Clinical-Type Research Framework? International Journal of Sports Medicine, 29(09), 703–705. https://doi.org/10.1055/s-2008-1038545
Baker, M. (2016). 1,500 scientists lift the lid on reproducibility. Nature, 533(7604), 452–454. https://doi.org/10.1038/533452a
Bernards, J., Sato, K., Haff, G., & Bazyler, C. (2017). Current Research and Statistical Practices in Sport Science and a Need for Change. Sports, 5(4), 87. https://doi.org/10.3390/sports5040087
Blincoe, S., & Buchert, S. (2019). Research Preregistration as a Teaching and Learning Tool in Undergraduate Psychology Courses. Psychology Learning & Teaching, 19(1), 107–115. https://doi.org/10.1177/1475725719875844
Borg, D. N., Bon, J. J., Sainani, K. L., Baguley, B. J., Tierney, N. J., & Drovandi, C. (2020). Comment on: “Moving Sport and Exercise Science Forward: A Call for the Adoption of More Transparent Research Practices.” Sports Medicine, 50(8), 1551–1553. https://doi.org/10.1007/s40279-020-01298-5
Borg, D. N., Lohse, K. R., & Sainani, K. L. (2020). Ten Common Statistical Errors from All Phases of Research, and Their Fixes. PM&R, 12(6), 610–614. https://doi.org/10.1002/pmrj.12395
Boulbes, D. R., Costello, T., Baggerly, K., Fan, F., Wang, R., Bhattacharya, R., Ye, X., & Ellis, L. M. (2018). A Survey on Data Reproducibility and the Effect of Publication Process on the Ethical Reporting of Laboratory Research. Clinical Cancer Research, 24(14), 3447–3455. https://doi.org/10.1158/1078-0432.ccr-18-0227
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa
Braun, V., & Clarke, V. (2019). Reflecting on reflexive thematic analysis. Qualitative Research in Sport, Exercise and Health, 11(4), 589–597. https://doi.org/10.1080/2159676x.2019.1628806
Buttliere, B., & Wicherts, J. M. (2018, May 25). Opinions on the value of direct replication: A survey of 2,000 psychologists. PsyArXiv. https://doi.org/10.31234/osf.io/z9kx6
Büttner, F., Toomey, E., McClean, S., Roe, M., & Delahunt, E. (2020). Are questionable research practices facilitating new discoveries in sport and exercise medicine? The proportion of supported hypotheses is implausibly high. British Journal of Sports Medicine, 54(22), 1365–1371. https://doi.org/10.1136/bjsports-2019-101863
Caldwell, A. R., Vigotsky, A. D., Tenan, M. S., Radel, R., Mellor, D. T., Kreutzer, A., Lahart, I. M., Mills, J. P., & Boisgontier, M. P. (2020). Moving Sport and Exercise Science Forward: A Call for the Adoption of More Transparent Research Practices. Sports Medicine, 50(3), 449–459. https://doi.org/10.1007/s40279-019-01227-1
Camerer, C. F., Dreber, A., Forsell, E., Ho, T.-H., Huber, J., Johannesson, M., Kirchler, M., Almenberg, J., Altmejd, A., Chan, T., Heikensten, E., Holzmeister, F., Imai, T., Isaksson, S., Nave, G., Pfeiffer, T., Razen, M., & Wu, H. (2016). Evaluating replicability of laboratory experiments in economics. Science, 351(6280), 1433–1436. https://doi.org/10.1126/science.aaf0918
Camerer, C. F., Dreber, A., Holzmeister, F., Ho, T.-H., Huber, J., Johannesson, M., Kirchler, M., Nave, G., Nosek, B. A., Pfeiffer, T., Altmejd, A., Buttrick, N., Chan, T., Chen, Y., Forsell, E., Gampa, A., Heikensten, E., Hummer, L., Imai, T., … Wu, H. (2018). Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nature Human Behaviour, 2(9), 637–644. https://doi.org/10.1038/s41562-018-0399-z
Chalmers, S., Debenedictis, T. A., Zacharia, A., Townsley, S., Gleeson, C., Lynagh, M., Townsley, A., & Fuller, J. T. (2018). Asymmetry during Functional Movement Screening and injury risk in junior football players: A replication study. Scandinavian Journal of Medicine & Science in Sports, 28(3), 1281–1287. https://doi.org/10.1111/sms.13021
Chambers, C., Feredoes, E., Muthukumaraswamy, S., & Etchells, P. (2014). Instead of “playing the game” it is time to change the rules: Registered Reports at AIMS Neuroscience and beyond. AIMS Neuroscience, 1(1), 4–17. https://doi.org/10.3934/neuroscience.2014.1.4
de Vrieze, J. (2021). Large survey finds questionable research practices are common. Science, 373(6552), 265. https://doi.org/10.1126/science.373.6552.265
Derksen, M., & Field, S. (2021). The Tone Debate: Knowledge, Self, and Social Order. Review of General Psychology, 26(2), 172–183. https://doi.org/10.1177/10892680211015636
Errington, T. M., Mathur, M., Soderberg, C. K., Denis, A., Perfito, N., Iorns, E., & Nosek, B. A. (2021). Investigating the replicability of preclinical cancer biology. eLife, 10. https://doi.org/10.7554/elife.71601
Eufemia, L., Bonatti, M., & Lana, M. A. (2018). Colombia’s rural development must honour peace agreement. Nature, 560(7716), 29–29. https://doi.org/10.1038/d41586-018-05847-x
Fanelli, D., Costas, R., & Ioannidis, J. P. A. (2017). Meta-assessment of bias in science. Proceedings of the National Academy of Sciences, 114(14), 3714–3719. https://doi.org/10.1073/pnas.1618569114
Gabelica, M., Bojčić, R., & Puljak, L. (2022). Many researchers were not compliant with their published data sharing statement: a mixed-methods study. Journal of Clinical Epidemiology, 150, 33–41. https://doi.org/10.1016/j.jclinepi.2022.05.019
Guttinger, S. (2020). The limits of replicability. European Journal for Philosophy of Science, 10(2). https://doi.org/10.1007/s13194-019-0269-1
Halperin, I., Pyne, D. B., & Martin, D. T. (2015). Threats to Internal Validity in Exercise Science: A Review of Overlooked Confounding Variables. International Journal of Sports Physiology and Performance, 10(7), 823–829. https://doi.org/10.1123/ijspp.2014-0566
Hansford, H. J., Cashin, A. G., Wewege, M. A., Ferraro, M. C., McAuley, J. H., & Jones, M. D. (2022). Open and transparent sports science research: the role of journals to move the field forward. Knee Surgery, Sports Traumatology, Arthroscopy, 30(11), 3599–3601. https://doi.org/10.1007/s00167-022-06893-9
Heathers, J. (2022). Impact Factor Manipulation. Open Science Framework. https://doi.org/10.17605/OSF.IO/4C6XA
Heneghan, C., Perera, R., Nunan, D., Mahtani, K., & Gill, P. (2012). Forty years of sports performance research and little insight gained. BMJ, 345(jul18 3), e4797–e4797. https://doi.org/10.1136/bmj.e4797
Impellizzeri, F. M., McCall, A., & Meyer, T. (2019). Registered reports coming soon: our contribution to better science in football research. Science and Medicine in Football, 3(2), 87–88. https://doi.org/10.1080/24733938.2019.1603659
Janz, N., & Freese, J. (2020). Replicate Others as You Would Like to Be Replicated Yourself. PS: Political Science & Politics, 54(2), 305–308. https://doi.org/10.1017/s1049096520000943
Jekel, M., Fiedler, S., Allstadt Torras, R., Mischkowski, D., Dorrough, A. R., & Glöckner, A. (2019). How to Teach Open Science Principles in the Undergraduate Curriculum—The Hagen Cumulative Science Project. Psychology Learning & Teaching, 19(1), 91–106. https://doi.org/10.1177/1475725719868149
John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the Prevalence of Questionable Research Practices With Incentives for Truth Telling. Psychological Science, 23(5), 524–532. https://doi.org/10.1177/0956797611430953
Kathawalla, U.-K., Silverstein, P., & Syed, M. (2021). Easing Into Open Science: A Guide for Graduate Students and Their Advisors. Collabra: Psychology, 7(1). https://doi.org/10.1525/collabra.18684
Kent, B. A., Holman, C., Amoako, E., Antonietti, A., Azam, J. M., Ballhausen, H., Bediako, Y., Belasen, A. M., Carneiro, C. F. D., Chen, Y.-C., Compeer, E. B., Connor, C. A. C., Crüwell, S., Debat, H., Dorris, E., Ebrahimi, H., Erlich, J. C., Fernández-Chiappe, F., Fischer, F., … Weissgerber, T. L. (2022). Recommendations for empowering early career researchers to improve research culture and practice. PLOS Biology, 20(7), e3001680. https://doi.org/10.1371/journal.pbio.3001680
Klein, R. A., Ratliff, K. A., Vianello, M., Adams, R. B., Jr., Bahník, Š., Bernstein, M. J., Bocian, K., Brandt, M. J., Brooks, B., Brumbaugh, C. C., Cemalcilar, Z., Chandler, J., Cheong, W., Davis, W. E., Devos, T., Eisner, M., Frankowska, N., Furrow, D., Galliani, E. M., … Nosek, B. A. (2014). Investigating Variation in Replicability. Social Psychology, 45(3), 142–152. https://doi.org/10.1027/1864-9335/a000178
Klein, R. A., Vianello, M., Hasselman, F., Adams, B. G., Adams, R. B., Jr., Alper, S., Aveyard, M., Axt, J. R., Babalola, M. T., Bahník, Š., Batra, R., Berkics, M., Bernstein, M. J., Berry, D. R., Bialobrzeska, O., Binan, E. D., Bocian, K., Brandt, M. J., Busching, R., … Nosek, B. A. (2018). Many Labs 2: Investigating Variation in Replicability Across Samples and Settings. Advances in Methods and Practices in Psychological Science, 1(4), 443–490. https://doi.org/10.1177/2515245918810225
Knudson, D. (2017). Confidence crisis of results in biomechanics research. Sports Biomechanics, 16(4), 425–433. https://doi.org/10.1080/14763141.2016.1246603
Makel, M. C., & Plucker, J. A. (2014). Facts Are More Important Than Novelty. Educational Researcher, 43(6), 304–316. https://doi.org/10.3102/0013189x14545513
Maxwell, S. E., Lau, M. Y., & Howard, G. S. (2015). Is psychology suffering from a replication crisis? What does “failure to replicate” really mean? American Psychologist, 70(6), 487–498. https://doi.org/10.1037/a0039400
Mellor, D. (2021). Improving norms in research culture to incentivize transparency and rigor. Educational Psychologist, 56(2), 122–131. https://doi.org/10.1080/00461520.2021.1902329
Mesquida, C., Murphy, J., Lakens, D., & Warne, J. (2022). Replication concerns in sports and exercise science: a narrative review of selected methodological issues in the field. Royal Society Open Science, 9(12). https://doi.org/10.1098/rsos.220946
Morey, R. D., Chambers, C. D., Etchells, P. J., Harris, C. R., Hoekstra, R., Lakens, D., Lewandowsky, S., Morey, C. C., Newman, D. P., Schönbrodt, F. D., Vanpaemel, W., Wagenmakers, E.-J., & Zwaan, R. A. (2016). The Peer Reviewers’ Openness Initiative: incentivizing open research practices through peer review. Royal Society Open Science, 3(1), 150547. https://doi.org/10.1098/rsos.150547
Morin, J.-B., Samozino, P., Murata, M., Cross, M. R., & Nagahara, R. (2019). A simple method for computing sprint acceleration kinetics from running velocity data: Replication study with improved design. Journal of Biomechanics, 94, 82–87. https://doi.org/10.1016/j.jbiomech.2019.07.020
Munafo, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie du Sert, N., Simonsohn, U., Wagenmakers, E.-J., Ware, J. J., & Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1(1). https://doi.org/10.1038/s41562-016-0021
Murphy, J., Mesquida, C., Caldwell, A. R., Earp, B. D., & Warne, J. P. (2022). Proposal of a Selection Protocol for Replication of Studies in Sports and Exercise Science. Sports Medicine, 53(1), 281–291. https://doi.org/10.1007/s40279-022-01749-1
Nielsen, R. O., Chapman, C. M., Louis, W. R., Stovitz, S. D., Mansournia, M. A., Windt, J., Møller, M., Parner, E. T., Hulme, A., Bertelsen, M. L., Finch, C. F., Casals, M., & Verhagen, E. (2017). Seven sins when interpreting statistics in sports injury science. British Journal of Sports Medicine, 52(22), 1410–1412. https://doi.org/10.1136/bjsports-2017-098524
Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S. D., Breckler, S. J., Buck, S., Chambers, C. D., Chin, G., Christensen, G., Contestabile, M., Dafoe, A., Eich, E., Freese, J., Glennerster, R., Goroff, D., Green, D. P., Hesse, B., Humphreys, M., … Yarkoni, T. (2015). Promoting an open research culture. Science, 348(6242), 1422–1425. https://doi.org/10.1126/science.aab2374
Nosek, B. A., & Errington, T. M. (2020). What is replication? PLOS Biology, 18(3), e3000691. https://doi.org/10.1371/journal.pbio.3000691
Nosek, B. A., Hardwicke, T. E., Moshontz, H., Allard, A., Corker, K. S., Dreber, A., Fidler, F., Hilgard, J., Kline Struhl, M., Nuijten, M. B., Rohrer, J. M., Romero, F., Scheel, A. M., Scherer, L. D., Schönbrodt, F. D., & Vazire, S. (2022). Replicability, Robustness, and Reproducibility in Psychological Science. Annual Review of Psychology, 73(1), 719–748. https://doi.org/10.1146/annurev-psych-020821-114157
Nosek, B. A., Spies, J. R., & Motyl, M. (2012). Scientific Utopia. Perspectives on Psychological Science, 7(6), 615–631. https://doi.org/10.1177/1745691612459058
Nuijten, M. B., Hartgerink, C. H. J., van Assen, M. A. L. M., Epskamp, S., & Wicherts, J. M. (2015). The prevalence of statistical reporting errors in psychology (1985–2013). Behavior Research Methods, 48(4), 1205–1226. https://doi.org/10.3758/s13428-015-0664-2
Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251). https://doi.org/10.1126/science.aac4716
Ostaszewski, M. (2014). Analysis of the attitude within academic and research communities towards open science (quantitative survey). Academia.edu. https://www.academia.edu/10347757/Analysis_of_the_attitude_within_academic_and_research_cmmunities_towards_Open_Science_quantitative_survey
Pashler, H., & Wagenmakers, E. (2012). Editors’ Introduction to the Special Section on Replicability in Psychological Science. Perspectives on Psychological Science, 7(6), 528–530. https://doi.org/10.1177/1745691612465253
Pitsch, W., & Emrich, E. (2011). The frequency of doping in elite sport: Results of a replication study. International Review for the Sociology of Sport, 47(5), 559–580. https://doi.org/10.1177/1012690211413969
Piwowar, H. A., & Vision, T. J. (2013). Data reuse and the open data citation advantage. PeerJ, 1, e175. https://doi.org/10.7717/peerj.175
Powell, K. (2015). The future of the postdoc. Nature, 520(7546), 144–147. https://doi.org/10.1038/520144a
Pownall, M., Azevedo, F., König, L. M., Slack, H. R., Evans, T. R., Flack, Z., Grinschgl, S., Elsherif, M. M., Gilligan-Lee, K. A., Oliveira, C. M., Gjoneska, B., Kanadadze, T., Button, K. S., Ashcroft-Jones, S., Terry, J., Albayrak-Aydemir, N., Dechterenko, F., Alzahawi, S., Baker, B. J., … FORRT. (2022, April 8). Teaching Open and Reproducible Scholarship: A Critical Review of the Evidence Base for Current Pedagogical Methods and their Outcomes. MetaArXiv. https://doi.org/10.31222/osf.io/9e526
R Core Team. (2022). R: A Language and Environment for Statistical Computing. https://www.R-project.org/
Rosenthal, R. (1979). The file drawer problem and tolerance for null results. Psychological Bulletin, 86(3), 638–641. https://doi.org/10.1037/0033-2909.86.3.638
Ross-Hellauer, T., Deppe, A., & Schmidt, B. (2017). Survey on open peer review: Attitudes and experience amongst editors, authors and reviewers. PLOS ONE, 12(12), e0189311. https://doi.org/10.1371/journal.pone.0189311
Sainani, K. L., Borg, D. N., Caldwell, A. R., Butson, M. L., Tenan, M. S., Vickers, A. J., Vigotsky, A. D., Warmenhoven, J., Nguyen, R., Lohse, K. R., Knight, E. J., & Bargary, N. (2020). Call to increase statistical collaboration in sports science, sport and exercise medicine and sports physiotherapy. British Journal of Sports Medicine, 55(2), 118–122. https://doi.org/10.1136/bjsports-2020-102607
Sainani, K. L., & Chamari, K. (2022). Wish List for Improving the Quality of Statistics in Sport Science. International Journal of Sports Physiology and Performance, 17(5), 673–674. https://doi.org/10.1123/ijspp.2022-0023
Sainani, K. L., Lohse, K. R., Jones, P. R., & Vickers, A. (2019). Magnitude‐based Inference is not Bayesian and is not a valid method of inference. Scandinavian Journal of Medicine & Science in Sports, 29(9), 1428–1436. https://doi.org/10.1111/sms.13491
Scheel, A. M., Tiokhin, L., Isager, P. M., & Lakens, D. (2020). Why Hypothesis Testers Should Spend Less Time Testing Hypotheses. Perspectives on Psychological Science, 16(4), 744–755. https://doi.org/10.1177/1745691620966795
Schmidt, S. (2009). Shall we Really do it Again? The Powerful Concept of Replication is Neglected in the Social Sciences. Review of General Psychology, 13(2), 90–100. https://doi.org/10.1037/a0015108
Smaldino, P. E., & McElreath, R. (2016). The natural selection of bad science. Royal Society Open Science, 3(9), 160384. https://doi.org/10.1098/rsos.160384
Smith, L. M., Yu, F., & Schmid, K. K. (2021). Role of Replication Research in Biostatistics Graduate Education. Journal of Statistics and Data Science Education, 29(1), 95–104. https://doi.org/10.1080/10691898.2020.1844105
Stojmenovska, D., Bol, T., & Leopold, T. (2019). Teaching Replication to Graduate Students. Teaching Sociology, 47(4), 303–313. https://doi.org/10.1177/0092055x19867996
Stroebe, W., & Strack, F. (2014). The Alleged Crisis and the Illusion of Exact Replication. Perspectives on Psychological Science, 9(1), 59–71. https://doi.org/10.1177/1745691613514450
Turner, L., Shamseer, L., Altman, D. G., Schulz, K. F., & Moher, D. (2012). Does use of the CONSORT Statement impact the completeness of reporting of randomised controlled trials published in medical journals? A Cochrane review. Systematic Reviews, 1(1). https://doi.org/10.1186/2046-4053-1-60
Twomey, R., Yingling, V., Warne, J., Schneider, C., McCrum, C., Atkins, W., Murphy, J., Romero Medina, C., Harlley, S., & Caldwell, A. (2021). Nature of Our Literature. Communications in Kinesiology, 1(3). https://doi.org/10.51224/cik.v1i3.43
Vines, T. H., Albert, A. Y. K., Andrew, R. L., Débarre, F., Bock, D. G., Franklin, M. T., Gilbert, K. J., Moore, J.-S., Renaut, S., & Rennison, D. J. (2014). The Availability of Research Data Declines Rapidly with Article Age. Current Biology, 24(1), 94–97. https://doi.org/10.1016/j.cub.2013.11.014
Vinkers, C. H., Tijdink, J. K., & Otte, W. M. (2015). Use of positive and negative words in scientific PubMed abstracts between 1974 and 2014: retrospective analysis. BMJ, h6467. https://doi.org/10.1136/bmj.h6467
Wagge, J. R., Brandt, M. J., Lazarevic, L. B., Legate, N., Christopherson, C., Wiggins, B., & Grahe, J. E. (2019). Publishing Research With Undergraduate Students via Replication Work: The Collaborative Replications and Education Project. Frontiers in Psychology, 10. https://doi.org/10.3389/fpsyg.2019.00247

  1. Retesting a claim using the same data and comparable analyses, as opposed to replication, which uses new data (Nosek & Errington, 2020).

  2. Preregistration can be done at the Open Science Framework or posted as a preprint of the protocol at SportRxiv.

  3. As Gabelica et al. (2022) state, this essentially means no data is available.

  4. For further discussion see Derksen & Field (2021).

  5. All question information can be found in the data repository.




