The Nature of Our Literature: A Registered Report on the Positive Result Rate and Reporting Practices in Kinesiology


Rosie Twomey
https://orcid.org/0000-0001-8313-6656
Vanessa R. Yingling
https://orcid.org/0000-0002-7775-6223
Joe P. Warne
https://orcid.org/0000-0002-4359-8132
Christoph Schneider
https://orcid.org/0000-0003-3760-7001
Christopher McCrum
https://orcid.org/0000-0002-4927-1114
Whitley C. Atkins
Claudia Romero Medina
Sena Harlley
Aaron R. Caldwell
https://orcid.org/0000-0002-4541-6283

Abstract

Scientists rely on an accurate scientific literature to build and test new theories about the natural world. In the past decade, observational studies of the scientific literature have indicated that numerous questionable research practices and poor reporting practices may be hindering scientific progress. In particular, three recent studies have indicated an implausibly high rate of studies with positive (i.e., hypothesis-confirming) results. In sports medicine, a field closely related to kinesiology, studies that tested a hypothesis reported support for their primary hypothesis ~70% of the time. However, a study of journals covering the entire field of kinesiology has yet to be completed, and the quality of other reporting practices, such as clinical trial registration, has not been evaluated. In this study, we retrospectively evaluated 300 original research articles from the flagship journals of North America (Medicine and Science in Sports and Exercise), Europe (European Journal of Sport Science), and Australia (Journal of Science and Medicine in Sport). The hypothesis-testing rate (~64%) and positive result rate (~81%) were much lower than those reported in other fields (e.g., psychology), and there was only weak evidence for our hypothesis that the positive result rate exceeded 80%. Nevertheless, this positive result rate is still unreasonably high. Additionally, most studies did not report trial registration and rarely included accessible data, indicating poor reporting practices. The majority of studies relied on significance testing (~92%), and, more concerning, a majority of studies without a stated hypothesis (~82%) also relied on significance testing. Overall, the positive result rate in kinesiology is unacceptably high, despite being lower than in fields such as psychology, and most published manuscripts demonstrated subpar reporting practices.
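The abstract's headline rates can be illustrated from the counts they imply. The sketch below uses approximate counts back-calculated from the reported percentages (300 articles, ~64% testing a hypothesis, ~81% of those supporting it); these are illustrative assumptions, not the study's actual data, and the exact one-sided binomial test is a simple frequentist stand-in rather than the study's registered (Bayesian) analysis:

```python
from math import comb

# Illustrative counts back-calculated from the abstract's percentages;
# approximations for demonstration, not the study's actual coded data.
n_articles = 300
n_tested = 192     # ~64% of 300 articles stated and tested a hypothesis
n_positive = 156   # ~81% of those reported support for their hypothesis

hypothesis_testing_rate = n_tested / n_articles  # 0.64
positive_result_rate = n_positive / n_tested     # 0.8125

# Exact one-sided binomial test of whether the positive result rate
# exceeds 80%: P(X >= n_positive) for X ~ Binomial(n_tested, 0.80).
p0 = 0.80
p_value = sum(
    comb(n_tested, k) * p0**k * (1 - p0) ** (n_tested - k)
    for k in range(n_positive, n_tested + 1)
)

print(f"hypothesis testing rate: {hypothesis_testing_rate:.2f}")
print(f"positive result rate:    {positive_result_rate:.4f}")
print(f"one-sided p (> 0.80):    {p_value:.2f}")
```

With these counts the observed proportion (~0.81) sits barely above 0.80, and the p-value is well above conventional thresholds, in line with the abstract's finding of only weak evidence that the positive result rate exceeds 80%.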


Article Details

How to Cite
Twomey, R., Yingling, V., Warne, J., Schneider, C., McCrum, C., Atkins, W., … Caldwell, A. (2021). The Nature of Our Literature: A Registered Report on the Positive Result Rate and Reporting Practices in Kinesiology. Communications in Kinesiology, 1(3). https://doi.org/10.51224/cik.v1i3.43
Section
Kinesiology
