The Decline Effect Permeates Not Only Intelligence Research, But Psychology as a Whole: Meta-Meta-Analytic Evidence From 648 Meta-Analyses
1  Department of Developmental and Educational Psychology, Faculty of Psychology, University of Vienna, Austria
2  Vienna Doctoral School in Cognition, Behavior, and Neuroscience (VDS CoBeNe), University of Vienna, Austria
3  Department of Clinical and Health Psychology, Faculty of Psychology, University of Vienna, Austria
4  Department of Cognition, Emotion, and Methods in Psychology, Faculty of Psychology, University of Vienna, Austria
5  Department of Methodology and Statistics, School of Social and Behavioral Sciences, Tilburg University, Netherlands
Academic Editor: David Giofrè

Abstract:

The term decline effect describes the tendency of reported effect sizes to decrease in strength as evidence accumulates over time, suggesting that early published findings in scientific research are often inflated. Conceptually, such declines have been attributed to publication bias, low study power, and questionable research practices. However, systematic empirical evidence of declining effects in psychological science has been limited. In the present meta-meta-analysis, we examined whether such systematic declines occur within intelligence research, other psychological disciplines, and psychological science in general. Across 670 meta-analyses published in six highly visible journals in psychology (k = 62,542, N > 60 million), we found that in intelligence research, declines occurred about twice as often as increases and were substantially larger in size (average misestimations of initial vs. meta-analytic summary effects: Δr = .18 vs. .08). Furthermore, initial studies associated with declining effects exhibited somewhat lower average power to detect the summary effect than those linked to increases (Mdn power = 52.31% vs. 59.76%). When examining psychology studies in general, virtually identical results emerged: effect declines outnumbered increases nearly two to one and were considerably larger in strength (Δr = .204 vs. .122). Moreover, original studies associated with declines showed lower power to detect the observed summary effects (M = 48.7%, Mdn = 39.4%) than studies linked to underestimations (M = 65.7%, Mdn = 82.7%). In all, our findings show that the decline effect is not limited to a single research domain but represents a pervasive challenge across psychological science, rooted in the inflation of early findings and inadequate study power.

Keywords: decline effect; meta-meta-analysis; replicability; effect misestimation; statistical power