2018
DOI: 10.31235/osf.io/4hmb6
Preprint

Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015

Abstract: These first ten authors contributed equally to this work.

Citations: Cited by 283 publications (455 citation statements)
References: 20 publications

Citation statements (ordered by relevance)
“…Specifically, probability information tends to be neglected in medical decisions. Such a replication of previous findings is especially valuable in light of the recent replication crisis in the social sciences (Camerer et al.; Open Science Collaboration). As a novel contribution, our results showed that the gap between medical and monetary decisions generalizes to choices for others: Choice behavior and information search in medical decisions differ from monetary decisions irrespective of who is affected by the outcome.…”
Section: Discussion (mentioning)
confidence: 96%
“…Another noteworthy replication initiative is the Social Sciences Replication Project, whose collaborators aimed to replicate 21 experimental studies in the social sciences published in the prestigious journals Nature and Science between 2010 and 2015. They find a significant effect in the same direction as the original study for 13 of the studies (62%), and the effect size of the replications is on average about 50% of the original effect size (Camerer et al.). Finally, while both of the above‐mentioned projects focus on results published in top journals, Maniadis, Tufano, and List analyze replication attempts from 150 economics journals, and find a “success rate” of 42.3% among the 85 experimental replication studies in their sample.…”
Section: Dozen Things (mentioning)
confidence: 89%
“…Particularly, future studies should carefully examine the mechanisms behind the effect. Dawson et al.’s study stemmed from a metaphoric priming approach; however, such priming research should be approached with caution, as it has generated substantial scepticism in the social psychology field due to failures to replicate (Bower; Camerer et al.; Verschuere et al.; Yong). For example, in an effort to replicate Dawson et al.’s findings and other well‐known priming measures, Dianiska, Swanner, Brimbal, and Meissner examined the influence of lexical (i.e., word scrambles related to the openness concept), contextual (e.g., room posters depicting open settings) and embodiment primes (e.g., interviewers’ open or closed‐off body postures) on information disclosure, failing to find convincing evidence of their influence.…”
Section: Discussion (mentioning)
confidence: 99%