Evaluating replicability of laboratory experiments in economics
2016 | DOI: 10.1126/science.aaf0918

Abstract: Another social science looks at itself. Experimental economists have joined the reproducibility discussion by replicating selected published experiments from two top-tier journals in economics. Camerer et al. found that two-thirds of the 18 studies examined yielded replicable estimates of effect size and direction. This proportion is somewhat lower than unaffiliated experts were willing to bet in an associated prediction market, but roughly in line with expectation…

Cited by 1,001 publications (876 citation statements)
References 60 publications
“…This issue has been highlighted by recent large replication attempts in psychology and experimental economics (Open Science Collaboration, 2015; Camerer et al., 2016) as well as by recent critical reviews of oxytocin research on humans (Nave et al., 2015; Lane et al., 2016).…”
Section: Introduction
confidence: 99%
“…A recent coordinated replication of 18 between-subject laboratory experiments from two top economics journals found that only 11 (61%) studies replicated as measured by p < .05, despite being powered at over 90% and sending study plans to original authors for verification (Camerer et al., 2016). Results for reproducing the results of nonexperimental studies using the same data and analyses have been even worse, with rates below 50% (Chang & Li, 2015, 49% with original authors' assistance, 33% without; Dewald, Thursby, & Anderson, 1986, 13%; McCullough, McGeary, & Harrison, 2006, 23%).…”
Section: A Science-wide Problem
confidence: 99%
“…When Dreber's team repeated 18 economics experiments as part of a follow-up to her psychology investigation, both the prediction markets and surveys of individuals overestimated the odds of each study's reproducibility. Dreber isn't sure why this happened.…”
Section: Field-testing the Future
confidence: 99%