[Figure: "Is there a reproducibility crisis?" — 52% Yes, a significant crisis; 38% Yes, a slight crisis; 7% Don't know; 3% No, there is no crisis. 1,576 researchers surveyed.]

More than 70% of researchers have tried and failed to reproduce another scientist's experiments, and more than half have failed to reproduce their own experiments. Those are some of the telling figures that emerged from Nature's survey of 1,576 researchers who took a brief online questionnaire on reproducibility in research. The data reveal sometimes-contradictory attitudes towards reproducibility. Although 52% of those surveyed agree that there is a significant 'crisis' of reproducibility, less than 31% think that failure to reproduce published results means that the result is probably wrong, and most say that they still trust the published literature. Data on how much of the scientific literature is reproducible are rare and generally bleak. The best-known analyses, from psychology [1] and cancer biology [2], found rates of around 40% and 10%, respectively. Our survey respondents were more optimistic: 73% said that they think that at least half of the papers in their field can be trusted, with physicists and chemists generally showing the most confidence. The results capture a confusing snapshot of attitudes around these issues, says Arturo Casadevall, a microbiologist at the Johns Hopkins Bloomberg School of Public Health in Baltimore, Maryland. "At the current time there is no consensus on what reproducibility is or should be." But just recognizing that is a step forward, he says. "The next step may be identifying what is the problem and to get a consensus."
Beginning January 2014, Psychological Science gave authors the opportunity to signal open data and materials if they qualified for badges that accompanied published articles. Before badges, less than 3% of Psychological Science articles reported open data. After badges, 23% reported open data, with an accelerating trend; 39% reported open data in the first half of 2015, an increase of more than an order of magnitude from baseline. There was no change over time in the low rates of data sharing among comparison journals. Moreover, reporting openness does not guarantee openness. When badges were earned, reportedly available data were more likely to be actually available, correct, usable, and complete than when badges were not earned. Open materials also increased to a weaker degree, and there was more variability among comparison journals. Badges are simple, effective signals to promote open practices and improve preservation of data and materials by using independent repositories.
Many Labs 3 is a crowdsourced project that systematically evaluated time-of-semester effects across many participant pools. See the Wiki for a table of contents of files and to download the manuscript.
The university participant pool is a key resource for behavioral research, and data quality is believed to vary over the course of the academic semester. This crowdsourced project examined time of semester variation in 10 known effects, 10 individual differences, and 3 data quality indicators over the course of the academic semester in 20 participant pools (N = 2,696) and with an online sample (N = 737). Weak time of semester effects were observed on data quality indicators, participant sex, and a few individual differences: conscientiousness, mood, and stress. However, there was little evidence for time of semester qualifying experimental or correlational effects. The generality of this evidence is unknown because only a subset of the tested effects demonstrated evidence for the original result in the whole sample. Mean characteristics of pool samples change slightly during the semester, but these data suggest that those changes are mostly irrelevant for detecting effects.

Keywords: social psychology; cognitive psychology; replication; participant pool; individual differences; sampling effects; situational effects

Many Labs 3: Evaluating participant pool quality across the academic semester via replication

University participant pools provide access to participants for a great deal of published behavioral research. The typical participant pool consists of undergraduates enrolled in introductory psychology courses that require students to complete some number of experiments over the course of the academic semester. Common variations might include using other courses to recruit participants or making study participation an option for extra credit rather than a pedagogical requirement. Research-intensive universities often have a highly organized participant pool with a participant management system for signing up for studies and assigning credit.
Smaller or teaching-oriented institutions often have more informal participant pools that are organized ad hoc each semester or for an individual class. To avoid selection bias based on study content, most participant pools have procedures to avoid disclosing the content or purpose of individual studies during the sign-up process. However, students are usually free to choose the time during the semester that they sign up to complete the studies. This may introduce a selection bias in which data collection on different dates occurs with different kinds of participants, or in different situational circumstances (e.g., the carefree semester beginning versus the exam-stressed semester end). If participant characteristics differ across time during the academic semester, then the results of studies may be moderated by the time at which data collection occurs. Indeed, among behavioral researchers there are widespread intuitions, superstitions, and anecdotes about the "best" time to collect data in order to minimize error and maximize power. It is common, for example, to hear stories of an effect being obtained in the first part of the semester that then "d...
Data Availability Statement: All data and materials are publicly accessible at https://osf.io/rfgdw/. We also preregistered our study design and analysis plan; the preregistration is available at https://osf.io/ipkea/.

Funding: The authors received no specific funding for this work.

Competing Interests: Brian Nosek created the badges to acknowledge open practices, and Brian Nosek and Mallory Kidwell are on a committee maintaining the badges. The badges and specifications for earning them are CC0 licensed.

We evaluated the effect of badges in Psychological Science and in four comparison journals. We report an increase in reported data sharing of more than an order of magnitude from baseline in Psychological Science, as well as an increase in reported materials sharing, although to a weaker degree. Moreover, we show that reportedly available data and materials were more accessible, correct, usable, and complete when badges were earned. We demonstrate that badges are effective incentives that improve the openness, accessibility, and persistence of data and materials that underlie scientific research.
Recent research suggests that individuals play an active role in their own personality development. Here, we investigated lay conceptions of this volitional personality change process. In Study 1, participants (N = 602) provided open-ended descriptions of their desired personality changes as well as the strategies they were using to achieve these changes. In Study 2, participants (N = 578) completed these same measures and provided narrative descriptions of the emergence of their desires for (and previous) personality changes. Desired changes were quantified in a manner consistent with the Five-Factor Model (though desires pertinent to Openness to Experience were rare), whereas reported strategies were distinguished on the basis of cognitive and behavioral content. Desires to increase in Extraversion corresponded negatively with the use of cognitive strategies and positively with the use of behavioral strategies, whereas desires to increase in Agreeableness exhibited the opposite pattern. Finally, desires for change were typically construed as stimulated by specific events, whereas previous personality changes were attributed to shifts in social roles. Laypersons hold a diverse range of desired changes and strategies. In addition, different categories of events are recognized as catalysts of desires for (and previous) changes.
The purpose of this research is to quantitatively compare everyday situational experience around the world. Local collaborators recruited 5,447 members of college communities in 20 countries, who provided data via a Web site in 14 languages. Using the 89 items of the Riverside Situational Q‐sort (RSQ), participants described the situation they experienced the previous evening at 7:00 p.m. Correlations among the average situational profiles of each country ranged from r = .73 to r = .95; the typical situation was described as largely pleasant. Most similar were the United States/Canada; least similar were South Korea/Denmark. Japan had the most homogeneous situational experience; South Korea, the least. The 15 RSQ items varying the most across countries described relatively negative aspects of situational experience; the 15 least varying items were more positive. Further analyses correlated RSQ items with national scores on six value dimensions, the Big Five traits, economic output, and population. Individualism, Neuroticism, Openness, and Gross Domestic Product yielded more significant correlations than expected by chance. Psychological research traditionally has paid more attention to the assessment of persons than of situations, a discrepancy that extends to cross‐cultural psychology. The present study demonstrates how cultures vary in situational experience in psychologically meaningful ways.