Citations

Article
Two meta-analyses examined the effects of growth mindset interventions. Burnette et al. (2023) tested two moderators and found that effects ranged from negative to positive. We (Macnamara & Burgoyne, 2023) tested 11 preregistered moderators and evaluated the evidence against a well-defined set of best practices. We found major areas of concern in the growth mindset intervention literature: for instance, 94% of growth mindset interventions included confounds, authors with a known financial incentive were two and a half times as likely to report positive effects, and higher-quality studies were less likely to demonstrate a benefit. Yan and Schuetze (2023) contextualized these findings by describing problems with mindset theory and its measurement. Likewise, Oyserman (2023) discussed how growth mindset is a culturally fluent idea: papers supportive of growth mindset are widely embraced, whereas papers taking a skeptical approach are challenged. In another commentary, Tipton et al. (2023) challenged our results, claiming to have produced positive effects by reanalyzing our data set using Burnette et al.'s (2023) approach. However, in addition to changing the approach, Tipton et al. changed effect sizes, how moderators were coded, and which studies were included, often without explanation. Though we appreciate the discussion of multiple meta-analytic approaches, we contend that meta-analytic decisions should be specified a priori, transparently reported, and consistently applied. Tipton et al.'s analysis illustrated our (Macnamara & Burgoyne, 2023) conclusion: apparent effects of growth mindset interventions on academic achievement may be attributable to inadequate study design, reporting flaws, and bias.
Preprint
According to mindset theory, students who believe their personal characteristics can change—that is, those who hold a growth mindset—will achieve more than students who believe their characteristics are fixed. Proponents of the theory have developed interventions to influence students’ mindsets, claiming that these interventions lead to large gains in academic achievement. Despite their popularity, the evidence for growth mindset intervention benefits has not been systematically evaluated considering both the quantity and quality of the evidence. Here, we provide such a review by (a) evaluating empirical studies’ adherence to a set of best practices essential for drawing causal conclusions and (b) conducting three meta-analyses. When examining all studies (63 studies, N = 97,672), we found major shortcomings in study design, analysis, and reporting, and suggestions of researcher and publication bias: Authors with a financial incentive to report positive findings published significantly larger effects than authors without this incentive. Across all studies, we observed a small overall effect: d̄ = 0.05, 95% CI = [0.02, 0.09], which was nonsignificant after correcting for potential publication bias. No theoretically meaningful moderators were significant. When examining only studies demonstrating the intervention influenced students’ mindsets as intended (13 studies, N = 18,355), the effect was nonsignificant: d̄ = 0.04, 95% CI = [−0.01, 0.10]. When examining the highest-quality evidence (6 studies, N = 13,571), the effect was nonsignificant: d̄ = 0.02, 95% CI = [−0.06, 0.10]. We conclude that apparent effects of growth mindset interventions on academic achievement are likely attributable to inadequate study design, reporting flaws, and bias.
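The pooled effects reported above (e.g., d̄ = 0.05, 95% CI = [0.02, 0.09]) come from combining study-level standardized mean differences. As a minimal sketch of how such a pooled estimate is formed—using simple fixed-effect inverse-variance weighting with made-up study values, not the authors' actual model, moderator analyses, or bias corrections—one could compute:

```python
import math

def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance pooling: weight each study's d by
    1/variance, then combine into a mean effect and a 95% CI.
    A simplified illustration, not the meta-analytic model used in the paper."""
    weights = [1.0 / v for v in variances]
    d_bar = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled mean
    return d_bar, (d_bar - 1.96 * se, d_bar + 1.96 * se)

# Hypothetical study-level effect sizes (d) and sampling variances
effects = [0.10, 0.02, -0.05, 0.08]
variances = [0.004, 0.002, 0.006, 0.003]

d_bar, (lo, hi) = pooled_effect(effects, variances)
print(f"pooled d = {d_bar:.3f}, 95% CI = [{lo:.3f}, {hi:.3f}]")
```

With these invented inputs the CI crosses zero, illustrating how a small pooled effect can be nonsignificant; the abstracts' subset analyses (mindset-manipulation-check studies, highest-quality studies) show the same qualitative pattern.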