Publication

Many analysts, one dataset: Making transparent how variations in analytical choices affect results

Silberzahn, R., Uhlmann, E. L., Martin, D. P., Anselmi, P., Aust, F., Awtrey, E., Bahnik, Š., Bai, F., Bannard, C., Bonnier, E., Carlsson, R., Cheung, F., Christensen, G., Clay, R., Craig, M. A., Dalla Rosa, A., Dam, L., Evans, M. H., Flores Cervantes, I., Fong, N., Gamez-Djokic, M., Glenz, A., Gordon-McKeon, S., Heaton, T. J., Hederos, K., Heene, M., Hofelich Mohr, A. J., Högden, F., Hui, K., Johannesson, M., Kalodimos, J., Kaszubowski, E., Kennedy, D. M., Lei, R., Lindsay, T. A., Liverani, S., Madan, C. R., Molden, D., Molleman, H., Morey, R. D., Mulder, L., Nijstad, B., Pope, N. G., Pope, B., Prenoveau, J. M., Rink, F., Robusto, E., Roderique, H., Sandberg, A., Schlüter, E., Schönbrodt, F. D., Sherman, M. F., Sommer, S. A., Sotak, K., Spain, S., Spörlein, C., Stafford, T., Stefanutti, L., Täuber, S., Ullrich, J., Vianello, M., Wagenmakers, E-J., Witkowiak, M., Yoon, S. & Nosek, B. A. 2017 (Accepted/In press). In: Advances in Methods and Practices in Psychological Science.

Research output: Article (scientific, peer-reviewed)

Twenty-nine teams involving 61 analysts used the same dataset to address the same research question: whether soccer referees are more likely to give red cards to dark-skin-toned players than to light-skin-toned players. Analytic approaches varied widely across teams, and estimated effect sizes ranged from 0.89 to 2.93 in odds-ratio units, with a median of 1.31. Twenty teams (69%) found a statistically significant positive effect, and nine teams (31%) observed a non-significant relationship. Overall, the 29 different analyses used 21 unique combinations of covariates. We found that neither analysts' prior beliefs about the effect, nor their level of expertise, nor the peer-rated quality of their analyses readily explained the variation in analysis outcomes. This suggests that significant variation in the analysis of complex data may be difficult to avoid, even by experts with honest intentions. Crowdsourcing data analysis, a strategy by which numerous research teams are recruited to simultaneously investigate the same research question, makes transparent how defensible, yet subjective, analytic choices influence research results.
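The effect sizes above are reported in odds-ratio units. As a minimal illustration of what such a number means (using hypothetical counts, not the study's actual data), an odds ratio can be computed from a 2×2 table of red cards by skin-tone group:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:

                     red card   no red card
    dark-skin-toned      a           b
    light-skin-toned     c           d

    Returns the odds of a red card in the first group
    divided by the odds in the second group.
    """
    return (a / b) / (c / d)

# Hypothetical counts for illustration only: a value above 1 would mean
# higher odds of a red card for dark-skin-toned players.
example = odds_ratio(30, 970, 20, 980)
print(round(example, 2))  # → 1.52
```

An odds ratio of 1.0 indicates no association; the teams' estimates of 0.89 to 2.93 thus range from a slightly negative to a strongly positive association, depending on the analytic choices made.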
Original language: English
Journal: Advances in Methods and Practices in Psychological Science
State: Accepted/In press - 2017

    Keywords

  • crowdsourcing science
  • data analysis
  • scientific transparency


ID: 48384892