
SEE CORRECTION FOR THIS ARTICLE

The effectiveness of nudging: A meta-analysis of choice architecture interventions across behavioral domains

Stephanie Mertens^a,1, Mario Herberz^a,b, Ulf J. J. Hahnel^a,b, and Tobias Brosch^a,b,1

^a Swiss Center for Affective Sciences, University of Geneva, CH-1202 Geneva, Switzerland; and ^b Department of Psychology, University of Geneva, CH-1205 Geneva, Switzerland

Edited by Susan Fiske, Psychology Department, Princeton University, Princeton, NJ; received April 27, 2021; accepted November 24, 2021

Over the past decade, choice architecture interventions or so-called nudges have received widespread attention from both researchers and policy makers. Built on insights from the behavioral sciences, this class of behavioral interventions focuses on the design of choice environments that facilitate personally and socially desirable decisions without restricting people in their freedom of choice. Drawing on more than 200 studies reporting over 440 effect sizes (n = 2,148,439), we present a comprehensive analysis of the effectiveness of choice architecture interventions across techniques, behavioral domains, and contextual study characteristics. Our results show that choice architecture interventions overall promote behavior change with a small to medium effect size of Cohen's d = 0.43 (95% CI [0.38, 0.48]). In addition, we find that the effectiveness of choice architecture interventions varies significantly as a function of technique and domain. Across behavioral domains, interventions that target the organization and structure of choice alternatives (decision structure) consistently outperform interventions that focus on the description of alternatives (decision information) or the reinforcement of behavioral intentions (decision assistance). Food choices are particularly responsive to choice architecture interventions, with effect sizes up to 2.5 times larger than those in other behavioral domains. Overall, choice architecture interventions affect behavior relatively independently of contextual study characteristics such as the geographical location or the target population of the intervention. Our analysis further reveals a moderate publication bias toward positive results in the literature. We end with a discussion of the implications of our findings for theory and behaviorally informed policy making.

choice architecture | nudge | behavioral insights | behavior change | meta-analysis

Significance

Changing individuals' behavior is key to tackling some of today's most pressing societal challenges such as the COVID-19 pandemic or climate change. Choice architecture interventions aim to nudge people toward personally and socially desirable behavior through the design of choice environments. Although increasingly popular, little is known about the overall effectiveness of choice architecture interventions and the conditions under which they facilitate behavior change. Here we quantitatively review over a decade of research, showing that choice architecture interventions successfully promote behavior change across key behavioral domains, populations, and locations. Our findings offer insights into the effects of choice architecture and provide guidelines for behaviorally informed policy making.

Many of today's most pressing societal challenges such as the successful navigation of the COVID-19 pandemic or the mitigation of climate change call for substantial changes in individuals' behavior. Whereas microeconomic and psychological approaches based on rational agent models have traditionally dominated the discussion about how to achieve behavior change, the release of Thaler and Sunstein's book Nudge: Improving Decisions about Health, Wealth, and Happiness (1) widely introduced a complementary intervention approach known as choice architecture or nudging, which aims to change behavior by (re)designing the physical, social, or psychological environment in which people make decisions while preserving their freedom of choice (2). Since the publication of the first edition of Thaler and Sunstein (1) in 2008, choice architecture interventions have seen an immense increase in popularity (Fig. 1). However, little is known about their overall effectiveness and the conditions under which they facilitate behavior change, a gap the present meta-analysis aims to address by analyzing the effects of the most widely used choice architecture techniques across key behavioral domains and contextual study characteristics.

Traditional microeconomic intervention approaches are often built around a rational agent model of decision making, which assumes that people base their decisions on known and consistent preferences that aim to maximize the utility, or value, of their actions. In determining their preferences, people are thought to engage in an exhaustive analysis of the probabilities and potential costs and benefits of all available options to identify which option provides the highest expected utility and is thus the most favorable (3). Interventions aiming to change behavior are accordingly designed to increase the utility of the desired option, either by educating people about the existing costs and benefits of a certain behavior or by creating entirely new incentive structures by means of subsidies, tax credits, fines, or similar economic measures. Likewise, traditional psychological intervention approaches explain behavior as the result of a deliberate decision making process that weighs and integrates internal representations of people's belief structures, values, attitudes, and norms (4, 5). Interventions accordingly focus on measures such as information campaigns that aim to shift behavior through changes in people's beliefs or attitudes (6).

Over the past years, intervention approaches informed by research in the behavioral sciences have emerged as a complement to rational agent-based approaches. They draw on an alternative model of decision making which acknowledges that people are bounded in their ability to make rational decisions. Rooted in dual-process theories of cognition and information processing

Author contributions: S.M., M.H., U.J.J.H., and T.B. designed research; S.M. and M.H. performed research; S.M. analyzed data; and S.M., M.H., U.J.J.H., and T.B. wrote the paper.

The authors declare no competing interest.

This article is a PNAS Direct Submission.

This open access article is distributed under Creative Commons Attribution-NonCommercial-NoDerivatives License 4.0 (CC BY-NC-ND).

^1 To whom correspondence may be addressed. Email: [email protected] or [email protected].

This article contains supporting information online at https://www.pnas.org/lookup/suppl/doi:10.1073/pnas.2107346118/-/DCSupplemental.

Published December 30, 2021.

PSYCHOLOGICAL AND COGNITIVE SCIENCES

PNAS 2022 Vol. 119 No. 1 e2107346118 https://doi.org/10.1073/pnas.2107346118 1 of 10
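The meta-analytic results above are aggregates of standardized mean differences. As background for the metric, here is a minimal sketch of how Cohen's d and an approximate 95% CI can be computed for a single two-group study; the function and all numbers are illustrative assumptions, not taken from the paper's dataset:

```python
import math

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference (Cohen's d) with a normal-approximation 95% CI."""
    # Pool the two group standard deviations
    sd_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sd_pooled
    # Large-sample variance of d (standard approximation), then a 95% CI
    var_d = (n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c))
    se = math.sqrt(var_d)
    return d, (d - 1.96 * se, d + 1.96 * se)

# Illustrative numbers for a hypothetical nudge study (treated vs. control)
d, ci = cohens_d(mean_t=5.2, mean_c=4.7, sd_t=1.1, sd_c=1.3, n_t=120, n_c=118)
```

A production analysis would typically also apply the small-sample correction (Hedges' g); the large-sample formula above is the common textbook approximation.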


[Figure 1: bar chart of citation counts ("Number of citations", 0 to 350) per publication year, 2008 to 2020.]

Fig. 1. Number of citations of Thaler and Sunstein (1) between 2008 and 2020. Counts are based on a citation search in Web of Science.

(7), this model recognizes that human behavior is not always driven by the elaborate and rational thought processes assumed by the rational agent model but instead often relies on automatic and computationally less intensive forms of decision making that allow people to navigate the demands of everyday life in the face of limited time, available information, and computational power (8, 9). Boundedly rational decision makers often construct their preferences ad hoc based on cognitive shortcuts and biases, which makes them susceptible to supposedly irrational contextual influences, such as the way in which information is presented or structured (10–12). This susceptibility to contextual factors, while seemingly detrimental to decision making, has been identified as a promising lever for behavior change because it offers the opportunity to influence people's decisions through simple changes in the so-called choice architecture that defines the physical, social, and psychological context in which decisions are made (2). Rather than relying on education or significant economic incentives, choice architecture interventions aim to guide people toward personally and socially desirable behavior by designing environments that anticipate and integrate people's limitations in decision making to facilitate access to decision-relevant information, support the evaluation and comparison of available choice alternatives, or reinforce previously formed behavioral intentions (13) (see Table 1 for an overview of intervention techniques based on choice architecture*).

Addressing Psychological Barriers through Choice Architecture

Contrary to the assumption of the rational agent model, people rarely have access to all relevant information when making a decision. Instead, they tend to base their decisions on information that is directly available to them at the moment of the decision (14, 15) and to discount or even ignore information that is too complex or meaningless to them (16, 17). Choice architecture interventions based on the provision of decision information aim to facilitate access to decision-relevant information by increasing its availability, comprehensibility, and/or personal relevance to the decision maker. One way to achieve this is to provide social reference information that reduces the ambiguity of a situation and helps overcome uncertainty about appropriate behavioral responses. In a natural field experiment with more than 600,000 US households, for instance, Allcott (18) demonstrated the effectiveness of descriptive social norms in promoting energy conservation. Specifically, the study showed that households which regularly received a letter comparing their own energy consumption to that of similar neighbors reduced their consumption by an average of 2%. This effect was estimated to be equivalent to that of a short-term electricity price increase of 11 to 20%. Other examples of decision information interventions include measures that increase the visibility of otherwise covert information (e.g., feedback devices and nutrition labels; refs. 19, 20), or that translate existing descriptions of choice options into more comprehensible or relevant information (e.g., through simplifying or reframing information; ref. 21).

Not only do people have limited access to decision-relevant information, but they often refrain from engaging in the elaborate cost-benefit analyses assumed by the rational agent model to evaluate and compare the expected utility of all choice options. Instead, they use contextual cues about the way in which choice alternatives are organized and structured within the decision environment to inform their behavior. Choice architecture interventions built around changes in the decision structure utilize this context dependency to influence behavior through the arrangement of choice alternatives or the format of decision making. One of the most prominent examples of this intervention approach is the choice default, or the preselection of an option that is imposed if no active choice is made. In a study comparing organ donation policies across European countries, Johnson and Goldstein (22) demonstrated the impact of defaults on even highly consequential decisions, showing that in countries with presumed consent laws, which by default register individuals as organ donors, the rate of donor registrations was nearly 60 percentage points higher than in countries with explicit consent laws, which require individuals to formally agree to becoming an

*While alternative classification schemes of choice architecture interventions can be found in the literature, the taxonomy used in the present meta-analysis distinguishes itself through its comprehensiveness, which makes it a highly reliable categorization tool and allows for inferences of both theoretical and practical relevance.
Table 1. Taxonomy of choice architecture categories and intervention techniques

Psychological barrier: Limited access to decision-relevant information
Intervention category: Decision information (increase the availability, comprehensibility, and/or personal relevance of information)
Intervention techniques:
- Translate information: adapt attributes to facilitate processing of already available information and/or shift decision maker's perspective
- Make information visible: provide access to relevant information
- Provide social reference point: provide social normative information to reduce situational ambiguity and behavioral uncertainty

Psychological barrier: Limited capacity to evaluate and compare choice options
Intervention category: Decision structure (alter the utility of choice options through their arrangement in the decision environment or the format of decision making)
Intervention techniques:
- Change choice defaults: set no-action default or prompt active choice to address behavioral inertia, loss aversion, and/or perceived endorsement
- Change option-related effort: adjust physical or financial effort to remove friction from desirable choice option
- Change range or composition of options: adapt categories or grouping of choice options to facilitate evaluation
- Change option consequences: adapt social consequences or microincentives to address present bias, bias in probability weighting, and/or loss aversion

Psychological barrier: Limited attention and self-control
Intervention category: Decision assistance (facilitate self-regulation)
Intervention techniques:
- Provide reminders: increase the attentional salience of desirable behavior to overcome inattention due to information overload
- Facilitate commitment: encourage self or public commitment to counteract failures of self-control
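The taxonomy above lends itself to a simple lookup structure when coding studies against categories and techniques. A sketch in Python; the shortened labels are our own illustrative choices, not identifiers from the paper:

```python
# Taxonomy of Table 1: psychological barrier -> (intervention category, techniques)
TAXONOMY = {
    "limited access to information": (
        "decision information",
        ["translate information", "make information visible",
         "provide social reference point"],
    ),
    "limited capacity to evaluate options": (
        "decision structure",
        ["change choice defaults", "change option-related effort",
         "change range or composition of options", "change option consequences"],
    ),
    "limited attention and self-control": (
        "decision assistance",
        ["provide reminders", "facilitate commitment"],
    ),
}

def category_of(technique):
    """Return the intervention category a technique belongs to, or None."""
    for _barrier, (category, techniques) in TAXONOMY.items():
        if technique in techniques:
            return category
    return None
```

Such a structure makes it easy to validate that each coded effect size maps onto exactly one of the nine techniques and three categories.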

organ donor. Other examples of decision structure interventions include changes in the effort related to choosing an option (23), the range or composition of options (24), and the consequences attached to options (25).

Even if people make a deliberate and potentially rational decision to change their behavior, limited attentional capacities and a lack of self-control may prevent this decision from actually translating into the desired actions, a phenomenon described as the intention–behavior gap (26). Choice architecture interventions that provide measures of decision assistance aim to bridge the intention–behavior gap by reinforcing self-regulation. One example of this intervention approach is the commitment device, which is designed to strengthen self-control by removing psychological barriers such as procrastination and intertemporal discounting that often stand in the way of successful behavior change. Thaler and Benartzi (27) demonstrated the effectiveness of such commitment devices in a large-scale field study of the Save More Tomorrow program, showing that employees increased their average saving rates from 3.5 to 13.6% when committing in advance to allocating parts of their future salary increases toward retirement savings. If applied across the United States, this program was estimated to increase the total of annual retirement contributions by approximately $25 billion for each 1% increase in saving rates. Other examples of decision assistance interventions are reminders, which affect decision making by increasing the salience of the intended behavior (28).

The Present Meta-analysis

Despite the growing interest in choice architecture, only a few attempts have been made to quantitatively integrate the empirical evidence on its effectiveness as a behavior change tool (29–32). Previous studies have mostly been restricted to the analysis of a single choice architecture technique (33–35) or a specific behavioral domain (36–39), leaving important questions unanswered, including how effective choice architecture interventions overall are in changing behavior and whether there are systematic differences across choice architecture techniques and behavioral domains that so far may have remained undetected and that may offer new insights into the psychological mechanisms that drive choice architecture interventions.

The aim of the present meta-analysis was to address these questions by first quantifying the overall effect of choice architecture interventions on behavior and then providing a systematic comparison of choice architecture interventions across different techniques, behavioral domains, and contextual study characteristics to answer 1) whether some choice architecture techniques are more effective in changing behavior than others, 2) whether some behavioral domains are more receptive to the effects of choice architecture interventions than others, 3) whether choice architecture techniques differ in their effectiveness across varying behavioral domains, and finally, 4) whether the effectiveness of choice architecture interventions is impacted by contextual study characteristics such as the location or target population of the intervention. Drawing on an exhaustive literature search that yielded more than 200 published and unpublished studies, this comprehensive analysis presents important insights into the effects and potential boundary conditions of choice architecture interventions and provides an evidence-based guideline for selecting behaviorally informed intervention measures.

Results

Effect Size of Choice Architecture. Our meta-analysis of 447 effect sizes from 212 publications (n = 2,148,439) revealed a statistically significant effect of choice architecture interventions on behavior (Cohen's d = 0.43, 95% CI [0.38, 0.48], t(333) = 16.51, P < 0.001) (Fig. 2). Using conventional criteria, this effect can be classified as small to medium (40). The effect size was reliable across several robustness checks, including the removal of influential outliers, which marginally decreased the overall size of the effect but did not change its statistical significance (d = 0.41, 95% CI [0.37, 0.46], t(331) = 17.61, P < 0.001). Additional leave-one-out analyses at the individual effect size level and the publication level found the effect of choice architecture interventions to be robust to the exclusion of any one effect size and publication, with d ranging from 0.42 to 0.44 and all P < 0.001. The total heterogeneity was estimated to be τ² = 0.16, indicating considerable variability in the effect size of choice architecture interventions. More specifically, the dispersion of effect sizes suggests that while the majority of choice architecture interventions will successfully promote the desired behavior change with
a small to large effect size, ∼15% of interventions are likely to backfire, i.e., reduce or even reverse the desired behavior, with a small to medium effect (95% prediction interval [−0.36, 1.22]) (40–42).

[Figure 2: forest plot; y axis: observation; x axis: effect size (Cohen's d) with 95% CI, −1 to 5; legend: model estimate with prediction interval, d = 0.43*** [−0.36, 1.22].]

Fig. 2. Forest plot of all effect sizes (k = 447) included in the meta-analysis with their corresponding 95% confidence intervals. Extracted Cohen's d values ranged from −0.69 to 3.08. The proportion of true to total variance was estimated at I² = 99.52%. ***P < 0.001.

Publication Bias. Visual inspection of the relation between effect sizes and their corresponding SEs (Fig. 3) revealed an asymmetric distribution that suggested a one-tailed overrepresentation of positive effect sizes in studies with comparatively low statistical power (43). This finding was formally confirmed by Egger's test (44), which found a positive association between effect sizes and SEs (b = 2.10, 95% CI [1.31, 2.89], t(332) = 5.22, P < 0.001). Together, these results point to a publication bias in the literature that may favor the reporting of successful as opposed to unsuccessful implementations of choice architecture interventions in studies with small sample sizes. Sensitivity analyses imposing a priori weight functions on a simplified random effects model suggested that this one-tailed publication bias could have potentially affected the estimate of our meta-analytic model (43). Assuming a moderate one-tailed publication bias in the literature attenuated the overall effect size of choice architecture interventions by 22.5% from Cohen's d = 0.40, 95% CI [0.36, 0.44], τ² = 0.16 (SE = 0.01) to d = 0.31, τ² = 0.18. Assuming a severe one-tailed publication bias attenuated the overall effect size even further to d = 0.08, τ² = 0.26; however, this assumption was only partially supported by the funnel plot. Although our general conclusion about the effects of choice architecture interventions on behavior remains the same in the light of these findings, the true effect size of interventions is likely to be smaller than estimated by our meta-analytic model due to the overrepresentation of positive effect sizes in our sample.

[Figure 3: funnel plot; y axis: standard error, 0.00 to 0.60; x axis: effect size (Cohen's d), −1 to 3.]

Fig. 3. Funnel plot displaying each observation as a function of its effect size and SE. In the absence of publication bias, observations should scatter symmetrically around the pooled effect size indicated by the gray vertical line and within the boundaries of the 95% confidence intervals shaded in white. The asymmetric distribution shown here indicates a one-tailed publication bias in the literature that favors the reporting of successful implementations of choice architecture interventions in studies with small sample sizes.

Moderator Analyses. Supported by the high heterogeneity among effect sizes, we next tested the extent to which the effectiveness of choice architecture interventions was moderated by the type of intervention, the behavioral domain in which it was implemented, and contextual study characteristics.

Intervention category and technique. Our first analysis focused on identifying potential differences between the effect sizes of decision information, decision structure, and decision assistance interventions. This analysis found that intervention category indeed moderated the effect of choice architecture interventions on behavior (F(2, 330) = 12.23, P < 0.001). With average effect sizes ranging from d = 0.28 to 0.54, interventions across all three categories were effective in inducing statistically significant behavior change (all P < 0.001; Fig. 4). Planned contrasts between categories, however, revealed that interventions in the decision structure category had a stronger effect on behavior compared to interventions in the decision information (b = 0.19, 95% CI [0.08, 0.31], t(330) = 3.26, P = 0.001) and the decision assistance category (b = 0.26, 95% CI [0.15, 0.36], t(330) = 4.93, P < 0.001). No difference was found in the effectiveness of decision information and decision assistance interventions (b = −0.06, 95% CI [−0.16, 0.04], t(330) = −1.26, P = 0.21). Including intervention category as a moderator in our meta-analytic model marginally reduced the proportion of true to total variability in effect sizes from I² = 99.52% to I² = 99.33% (I²(3) = 87.18%; I²(2) = 12.15%; SI Appendix, Table S3).

To test whether the effect sizes of the three intervention categories adequately represented differences on the underlying level of choice architecture techniques, we reran our analysis with intervention technique rather than category as the key moderator. As illustrated in Fig. 4, each of the nine intervention techniques was effective in inducing behavior change, with Cohen's d ranging from 0.23 to 0.62 (all P < 0.01). Within intervention categories, techniques were largely consistent in their effect sizes. Between categories, however, techniques showed in parts substantial differences in effect sizes. In line with the previously reported results, techniques within the decision structure category were consistently stronger in their effects on behavior than intervention techniques within the decision information or the decision assistance category. The observed effect size differences between
Intervention d 95% CI domain in our meta-analytic model reduced the proportion of
true to total effect size variability from I 2 = 99.52% to I 2 =
Decision information 2 2
99.29% (I(3) = 87.34%; I(2) = 11.95%; SI Appendix, Table S3).
Translationa 0.28 [0.17, 0.39]
Study characteristics. Last, we were interested in the extent to
Visibilityb 0.32 [0.25, 0.40]
which the effect size of choice architecture interventions was
Social referencec 0.36 [0.27, 0.46]
moderated by contextual study characteristics, such as the loca-
Average effect for categoryg 0.34 [0.27, 0.42] tion of the intervention (inside vs. outside of the United States),
Decision structure the target population of the intervention (adults vs. children and
adolescents), the experimental setting in which the intervention
Defaulta,b,c,d,e,f 0.62 [0.52, 0.73]
was investigated (conventional laboratory experiment, artifactual
Effort 0.48 [0.26, 0.70]
field experiment, framed field experiment, or natural field exper-
Composition 0.44 [0.25, 0.63]
iment; ref. 45), and the year in which the data were published. As
Consequenced 0.38 [0.31, 0.46]
can be seen in Table 2, choice architecture interventions affected
Average effect for categoryg,h 0.54 [0.46, 0.62]

Decision assistance Intervention d 95% CI


Remindere 0.29 [0.21, 0.37]
Health
Commitmentf 0.23 [0.08, 0.39]
Decision information 0.26 [0.09, 0.43]
Average effect for categoryh 0.28 [0.21, 0.35] Decision structurea 0.44 [0.29, 0.59]
Decision assistancea 0.20 [0.05, 0.35]
-0.2 0 0.2 0.4 0.6 0.8
Cohen’s d with 95% CI Average effect for domainf 0.34 [0.25, 0.43]
Downloaded from https://www.pnas.org by UNIVERSITY OF AARHUS STATE LIBRARY on April 2, 2025 from IP address 185.45.22.138.

Fig. 4. Forest plot of effect sizes across categories of choice architecture

PSYCHOLOGICAL AND
Food

COGNITIVE SCIENCES
intervention techniques (see Table 1 for more detailed description of tech- Decision information 0.44 [0.19, 0.70]
niques). The position of squares on the x axis indicates the effect size of Decision structureb 0.78 [0.54, 1.01]
each respective intervention technique. Bars indicate the 95% confidence
Decision assistanceb 0.43 [0.28, 0.59]
intervals of effect sizes. The size of squares is inversely proportional to the
SE of effect sizes. Diamond shapes indicate the average effect size and Average effect for domainf,g,h,i 0.65 [0.47, 0.83]
confidence intervals of intervention categories. The solid line represents an
effect size of Cohen’s d = 0. The dotted line represents the overall effect Environment
size of choice architecture interventions, Cohen’s d = 0.43, 95% CI [0.38,
Decision information 0.40 [0.22, 0.58]
0.48]. Identical letter superscripts indicate statistically significant (P < 0.05)
Decision structurec 0.52 [0.37, 0.68]
pairwise comparisons.

the decision information, the decision structure, and the decision assistance category were thus unlikely to be driven by a single intervention technique but rather representative of the entire set of techniques within those categories.

Behavioral domain. Following our analysis of the effectiveness of varying types of choice architecture interventions, we next focused on identifying potential differences among the behavioral domains in which interventions were implemented. As illustrated in Fig. 5, effect sizes varied quite substantially across domains, with Cohen's d ranging from 0.24 to 0.65. Our analysis confirmed that the effectiveness of interventions was moderated by domain, F(5, 327) = 3.64, P = 0.003. Specifically, it showed that choice architecture interventions, while generally effective in inducing behavior change across all six domains, had a particularly strong effect on behavior in the food domain, with d = 0.65 (95% CI [0.47, 0.83]). The smallest effects were observed in the financial domain. With an average intervention effect of d = 0.24 (95% CI [0.14, 0.35]), this domain was less receptive to choice architecture interventions than the other behavioral domains we investigated. Introducing behavioral domain as a moderator in our meta-analytic model marginally reduced the ratio of true to total heterogeneity among effect sizes from I^2 = 99.52% to I^2 = 99.40% (I^2(3) = 91.95%; I^2(2) = 7.44%; SI Appendix, Table S3).

Fig. 5. Forest plot of effect sizes across categories of choice architecture interventions and behavioral domains. The position of squares on the x axis indicates the effect size of each intervention category within a behavioral domain. Bars indicate the 95% confidence intervals of effect sizes. The size of squares is inversely proportional to the SE of effect sizes. Diamond shapes indicate the overall effect size and confidence intervals of choice architecture interventions within a behavioral domain. The solid line represents an effect size of Cohen's d = 0. The dotted line represents the overall effect size of choice architecture interventions, Cohen's d = 0.43, 95% CI [0.38, 0.48]. Identical letter superscripts indicate statistically significant (P < 0.05) pairwise comparisons within a behavioral domain.

Plot values visible in this excerpt (Cohen's d [95% CI]):
Environment: Decision assistance^c 0.25 [0.06, 0.43]; average effect for domain^g,j 0.43 [0.33, 0.54]
Finance: Decision information 0.23 [0.13, 0.33]; Decision structure 0.33 [0.20, 0.46]; Decision assistance 0.21 [0.10, 0.33]; average effect for domain^h,j 0.24 [0.14, 0.35]
Pro-social: Decision information 0.37 [0.23, 0.50]; Decision structure^d 0.48 [0.31, 0.66]; Decision assistance^d 0.21 [0.13, 0.30]; average effect for domain^i 0.41 [0.27, 0.54]
Other: Decision information 0.27 [0.20, 0.35]; Decision structure^e 0.41 [0.16, 0.66]; Decision assistance^e 0.20 [0.09, 0.31]; average effect for domain 0.31 [0.09, 0.52]

Intervention category across behavioral domain. Comparing the effectiveness of decision information, decision structure, and decision assistance interventions across domains consistently showed interventions within the decision structure category to have the largest effect on behavior, with Cohen's d ranging from 0.33 to 0.78 (Fig. 5). This result suggests that the observed effect size differences between the three categories of choice architecture interventions were relatively stable and independent from the behavioral domain in which interventions were applied. Including the interaction of intervention category and behavioral
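The I^2 ratios reported above can be illustrated with the standard single-level Higgins I^2 computed from Cochran's Q. This is a simplified sketch, not the paper's three-level decomposition into within- and between-publication components, and the function and variable names are our own:

```python
def i_squared(effects, variances):
    """Higgins' I^2: percentage of total variability across effect sizes
    attributable to between-study heterogeneity rather than sampling error
    (single-level sketch)."""
    w = [1.0 / v for v in variances]  # inverse-variance weights
    theta = sum(wi * d for wi, d in zip(w, effects)) / sum(w)  # pooled effect
    q = sum(wi * (d - theta) ** 2 for wi, d in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    if q <= df or q == 0:
        return 0.0
    return 100.0 * (q - df) / q

# three studies with identical sampling variance
print(i_squared([0.2, 0.4, 0.6], [0.01, 0.01, 0.01]))  # 75.0
```

With three studies of equal sampling variance and effects 0.2, 0.4, and 0.6, Q = 8 on 2 degrees of freedom, giving I^2 = 75%.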
Mertens et al., The effectiveness of nudging: A meta-analysis of choice architecture interventions across behavioral domains. PNAS. https://doi.org/10.1073/pnas.2107346118
Table 2. Parameter estimates of three-level meta-analytic models showing the overall effect size of choice architecture interventions as well as effect sizes across categories, techniques, behavioral domains, and contextual study characteristics

Effect                           k     n           d     95% CI         Test statistic      P
Random-effects model
  Overall effect size            447   2,148,439   0.43  [0.38, 0.48]   t(333) = 16.51      < 0.001
Mixed-effects models: substantive moderators
  Choice architecture category                                          F(2, 330) = 12.23   < 0.001
    Decision information^a       130   913,151     0.34  [0.27, 0.42]
    Decision structure^a,b       223   356,911     0.54  [0.46, 0.62]
    Decision assistance^b        94    878,377     0.28  [0.21, 0.35]
  Choice architecture technique                                         F(8, 324) = 4.48    < 0.001
    Translation^c                50    52,170      0.28  [0.17, 0.39]
    Visibility^d                 31    822,026     0.32  [0.25, 0.40]
    Social reference^e           49    38,955      0.36  [0.27, 0.46]
    Default^c,d,e,f,g,h          128   139,844     0.62  [0.52, 0.73]
    Effort                       23    7,985       0.48  [0.26, 0.70]
    Composition                  53    7,319       0.44  [0.25, 0.63]
    Consequence^f                19    201,763     0.38  [0.31, 0.46]
    Reminder^g                   69    870,386     0.29  [0.21, 0.37]
    Commitment^h                 25    7,991       0.23  [0.08, 0.39]
  Behavioral domain                                                     F(5, 327) = 3.64    0.003
    Health^i                     84    122,702     0.34  [0.25, 0.43]
    Food^i,j,k,l                 111   12,077      0.65  [0.47, 0.83]
    Environment^j,m              76    105,345     0.43  [0.33, 0.54]
    Finance^k,m                  45    38,730      0.24  [0.14, 0.35]
    Prosocial^l                  58    1,041,501   0.41  [0.27, 0.54]
    Other                        73    828,084     0.31  [0.09, 0.52]
Mixed-effects models: contextual study characteristics
  Location                                                              t(332) = 0.87       0.387
    Outside United States        186   1,214,261
    Inside United States         261   934,178
  Population                                                            t(332) = -0.54      0.587
    Children and adolescents     27    9,896
    Adults                       420   2,138,543
  Type of experiment                                                    F(3, 330) = 0.16    0.922
    Conventional laboratory      120   12,336      0.45  [0.36, 0.55]
    Artifactual field            156   48,824      0.41  [0.24, 0.57]
    Framed field                 81    15,032      0.47  [0.32, 0.61]
    Natural field                90    2,072,247   0.41  [0.14, 0.67]
  Year of publication            1982 to 2021*  2,148,439               t(332) = -3.56      < 0.001

k, number of effect sizes; n, sample size. Within each moderator with more than two subgroups, identical letter superscripts indicate statistically significant (P < 0.05) pairwise comparisons between subgroups.
*Values refer to range of publication years rather than number of effect sizes.

behavior relatively independently of contextual influences since neither location nor target population had a statistically significant impact on the effect size of interventions. In support of the external validity of behavioral measures, our analysis moreover did not find any difference in the effect size of different types of experiments. Only year of publication predicted the effect of interventions on behavior, with more recent publications reporting smaller effect sizes than older publications.

Discussion

Changing individuals' behavior is key to solving some of today's most pressing societal challenges. However, how can this behavior change be achieved? Recently, more and more researchers and policy makers have approached this question through the use of choice architecture interventions. The present meta-analysis integrates over a decade's worth of research to shed light on the effectiveness of choice architecture and the conditions under which it facilitates behavior change. Our results show that choice architecture interventions promote behavior change with a small to medium effect size of Cohen's d = 0.43, which is comparable to more traditional intervention approaches like education campaigns or financial incentives (46–48). Our findings are largely consistent with those of previous analyses that investigated the effectiveness of choice architecture interventions in a smaller subset of the literature (e.g., refs. 29, 30, 32, 33). In their recent meta-analysis of choice architecture interventions across academic disciplines, Beshears and Kosowsky (30), for example, found that choice architecture interventions had an average effect size of d = 0.41. Similarly, focusing on one choice architecture technique only, Jachimowicz et al. (33) found that choice defaults had an average effect size of d = 0.68, which is slightly higher than the effect size our analysis revealed for this intervention technique (d = 0.62). Our results suggest a somewhat higher overall effectiveness of choice architecture interventions than meta-analyses that have focused exclusively on field experimental research (31, 37), a discrepancy that holds even when accounting for differences between experimental settings (45). This inconsistency in findings may in part be explained by differences in meta-analytic samples. Only 7% of the studies analyzed by DellaVigna and Linos (31), for example,

meet the strict inclusion and exclusion criteria of the present meta-analysis. Among others, these criteria excluded studies that combined multiple choice architecture techniques. While this restriction allowed us to isolate the unique effect of each individual intervention technique, it may conflict with the reality of field experimental research that often requires researchers to leverage the effects of several choice architecture techniques to address the specific behavioral challenge at hand (see Materials and Methods for details on the literature search process and inclusion criteria). Similarly, the techniques that are available to field experimental researchers may not always align with the underlying psychological barriers to the target behavior (Table 1), decreasing their effectiveness in encouraging the desired behavior change.

Not only does choice architecture facilitate behavior change, but according to our results, it does so across a wide range of behavioral domains, population segments, and geographical locations. In contrast to theoretical and empirical work challenging its effectiveness (49–51), choice architecture constitutes a versatile intervention approach that lends itself as an effective behavior change tool across many contexts and policy areas. Although the present meta-analysis focuses on studies that tested the effects of choice architecture alone, the applicability of choice architecture is not restricted to stand-alone interventions but extends to hybrid policy measures that use choice architecture as a complement to more traditional intervention approaches (52). Previous research, for example, has shown that the impact of economic interventions such as taxes or financial incentives can be enhanced through choice architecture (53–55).

In addition to the overall effect size of choice architecture interventions, our systematic comparison of interventions across different techniques, behavioral domains, and contextual study characteristics reveals substantial variations in the effectiveness of choice architecture as a behavior change tool. Most notably, we find that across behavioral domains, decision structure interventions that modify decision environments to address decision makers' limited capacity to evaluate and compare choice options are consistently more effective in changing behavior than decision information interventions that address decision makers' limited access to decision-relevant information or decision assistance interventions that address decision makers' limited attention and self-control. This relative advantage of structural choice architecture techniques may be due to the specific psychological mechanisms that underlie the different intervention techniques or, more specifically, their demands on information processing. Decision information and decision assistance interventions rely on relatively elaborate forms of information processing in that the information and assistance they provide needs to be encoded and evaluated in terms of personal values and/or goals to determine the overall utility of a given choice option (56). Decision structure interventions, by contrast, often do not require this type of information processing but provide a general utility boost for specific choice options that offers a cognitive shortcut for determining the most desirable option (57, 58). Accordingly, decision information and decision assistance interventions have previously been described as attempts to facilitate more deliberate decision making processes, whereas decision structure interventions have been characterized as attempts to advance more automatic decision making processes (59). Decision information and decision assistance interventions may thus more frequently fail to induce behavior change and show overall smaller effect sizes than decision structure interventions because they may exceed people's cognitive limits in decision making more often, especially in situations of high cognitive load or time pressure.

The engagement of internal value and goal representations by decision information and decision assistance interventions introduces a second factor that may impact their effectiveness to change behavior: the moderating influence of individual differences. Nutrition labels, a prominent example of decision information interventions, for instance, have been shown to be more frequently used by consumers who are concerned about their diet and overall health than consumers who do not share those concerns (60). By targeting only certain population segments, information and assistance-based choice architecture interventions may show an overall smaller effect size when assessed at the population level compared to structure-based interventions, which rely less on individual values and goals and may therefore have an overall larger impact across the whole population. From a practical perspective, this suggests that policy makers who wish to use choice architecture as a behavioral intervention measure may need to precede decision information and decision assistance interventions by an assessment and analysis of the values and goals of the target population or, alternatively, choose a decision structure approach in cases when a segmentation of the population in terms of individual differences is not possible.

In summary, the higher effectiveness of decision structure interventions may potentially be explained by a combination of two factors: 1) lower demand on information processing and 2) lower susceptibility to individual differences in values and goals. Our explanation remains somewhat speculative, however, as empirical research especially on the cognitive processes underlying choice architecture interventions is still relatively scarce (but see refs. 53, 56, 57). More research efforts are needed to clarify the psychological mechanisms that drive the impact of choice architecture interventions and determine their effectiveness in changing behavior.

Besides the effect size variations between different categories of choice architecture techniques, our results reveal considerable differences in the effectiveness of choice architecture interventions across behavioral domains. Specifically, we find that choice architecture interventions had a particularly strong effect on behavior in the food domain, with average effect sizes up to 2.5 times larger than those in the health, environmental, financial, prosocial, or other behavioral domain.† A key characteristic of food choices and other food-related behaviors is the fact that they bear relatively low behavioral costs and few, if any, perceived long-term consequences for the decision maker. Previous research has found that the potential impact of a decision can indeed moderate the effectiveness of choice architecture interventions, with techniques such as gain and loss framing having a smaller effect on behavior when the decision at hand has a high, direct impact on the decision maker than when the decision has little to no impact (61). Consistent with this research, we observe not only the largest effect sizes of choice architecture interventions in the food domain but also the overall smallest effect sizes of interventions in the financial domain, a domain that predominantly represents decisions of high impact to the decision maker. This systematic variation of effect sizes across behavioral domains suggests that when making decisions that are perceived to have a substantial impact on their lives, people may be less prone to the influence of automatic biases and heuristics, and thus the effects of choice architecture interventions, than when making decisions of comparatively smaller impact.

† Please note that our results are robust to the exclusion of nonretracted studies by the Cornell Food and Brand Laboratory, which has been criticized for repeated scientific misconduct; retracted studies by this research group were excluded from the meta-analysis.

Another characteristic of food choices that may explain the high effectiveness of choice architecture interventions in the food

domain is the fact that they are often driven by habits. Commonly defined as highly automatized behavioral responses to cues in the choice environment, habits distinguish themselves from other behaviors through a particularly strong association between behavior on the one hand and choice environment on the other hand (62, 63). It is possible that choice architecture interventions benefit from this association to the extent that they target the choice environment and thus potentially alter triggers of habitualized, undesirable behaviors. To illustrate, previous research has shown that people tend to adjust their food consumption relative to portion size, meaning that they consume more when presented with large portions and less when presented with small portions (39). Here portion size acts as an environmental cue that triggers and guides the behavioral response to eat. Choice architecture interventions that target this environmental cue, for example, by changing the default size of a food portion, are likely to be successful in changing the amount of food people consume because they capitalize on the highly automatized association between portion size and food consumption. The congruence between factors that trigger habitualized behaviors and factors that are targeted by choice architecture interventions may not only explain why interventions in our sample were so effective in changing food choices but more generally indicate that choice architecture interventions are an effective tool for changing instances of habitualized behaviors (64). This finding is particularly relevant from a policy making perspective as habits tend to be relatively unresponsive to traditional intervention approaches and are therefore generally considered to be difficult to change (62). Given that choice architecture interventions can only target the environmental cues that trigger habitualized responses but not the association between choice environment and behavior per se, it should be noted though that the effects of interventions are likely limited to the specific choice contexts in which they are implemented.

While the present meta-analysis provides a comprehensive overview of the effectiveness of choice architecture as a behavior change tool, more research is needed to complement and complete our findings. For example, our methodological focus on individuals as the unit of analysis excludes a large number of studies that have investigated choice architecture interventions on broader levels, such as households, school classes, or organizations, which may reduce the generalizability of our results. Future research should target these studies specifically to add to the current analysis. Similarly, our data show very high levels of heterogeneity among the effect sizes of choice architecture interventions. Although the type of intervention, the behavioral domain in which it is applied, and contextual study characteristics account for some of this heterogeneity (SI Appendix, Table S3), more research is needed to identify factors that may explain the variability in effect sizes above and beyond those investigated here. Research has recently started to reveal some of those potential moderators of choice architecture interventions, including sociodemographic factors such as income and socioeconomic status as well as psychological factors such as domain knowledge, numerical ability, and attitudes (65–67). Investigating these moderators systematically can not only provide a more nuanced understanding of the conditions under which choice architecture facilitates behavior change but may also help to inform the design and implementation of targeted interventions that take into account individual differences in the susceptibility to choice architecture interventions (68). Ethical considerations should play a prominent role in this process to ensure that potentially more susceptible populations, such as children or low-income households, retain their ability to make decisions that are in their personal best interest (66, 69, 70). Based on the results of our own moderator analyses, additional avenues for future research may include the study of how information processing influences the effectiveness of varying types of choice architecture interventions and how the overall effect of interventions is determined by the type of behavior they target (e.g., high-impact vs. low-impact behaviors and habitual vs. one-time decisions). In addition, we identified a moderate publication bias toward the reporting of effect sizes that support a positive effect of choice architecture interventions on behavior. Future research efforts should take this finding into account and place special emphasis on appropriate sample size planning and analysis standards when evaluating choice architecture interventions. Finally, given our choice to focus our primary literature search on the terms "choice architecture" and "nudge," we recognize that the present meta-analysis may have failed to capture parts of the literature published before the popularization of this now widely used terminology, despite our efforts to expand the search beyond those terms (for details on the literature search process, see Materials and Methods). Due to the large increase in choice architecture research over the past decade (Fig. 1), however, the results presented here likely offer a good representation of the existing evidence on the effectiveness of choice architecture in changing individuals' behavior.

Conclusion

Few behavioral intervention measures have lately received as much attention from researchers and policy makers as choice architecture interventions. Integrating the results of more than 440 behavioral interventions, the present meta-analysis finds that choice architecture is an effective and widely applicable behavior change tool that facilitates personally and socially desirable choices across behavioral domains, geographical locations, and populations. Our results provide insights into the overall effectiveness of choice architecture interventions as well as systematic effect size variations among them, revealing promising directions for future research that may facilitate the development of theories in this still new but fast-growing field of research. Our work also provides a comprehensive overview of the effectiveness of choice architecture interventions across a wide range of intervention contexts that are representative of some of the most pressing societal challenges we are currently facing. This overview can serve as a guideline for policy makers who seek reliable, evidence-based information on the potential impact of choice architecture interventions and the conditions under which they promote behavior change.

Materials and Methods

The meta-analysis was conducted in accordance with guidelines for conducting systematic reviews (71) and conforms to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (72) standards.

Literature Search and Inclusion Criteria. We searched the electronic databases PsycINFO, PubMed, PubPsych, and ScienceDirect using a combination of keywords associated with choice architecture (nudge OR "choice architecture") and empirical research (method* OR empiric* OR procedure OR design).‡ Since the terms nudge and choice architecture were established only after the seminal book by Thaler and Sunstein (1), we restricted this search to studies that were published no earlier than 2008. To compensate for the potential bias this temporal restriction might introduce to the results of our meta-analysis, we identified additional studies, including studies published before 2008, through the reference lists of relevant review articles and a search for research reports by governmental and nongovernmental behavioral science units. To reduce the possibly confounding effects of publication status on the estimation of effect sizes, we further searched for unpublished studies using the ProQuest Dissertations & Theses database and requesting unpublished data through academic mailing lists. The search concluded in June 2019, yielding a total of 9,606 unique publications.

‡ Search terms were adapted from Szaszi et al. (73).
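The keyword combination described above can be written out as a single boolean query. The exact field syntax differs across PsycINFO, PubMed, PubPsych, and ScienceDirect, so this composition is only illustrative, and the variable names are our own:

```python
# Illustrative composition of the boolean search query described in the text;
# actual query syntax varies across the databases that were searched.
choice_architecture_terms = ['nudge', '"choice architecture"']
empirical_terms = ['method*', 'empiric*', 'procedure', 'design']

query = "({}) AND ({})".format(
    " OR ".join(choice_architecture_terms),
    " OR ".join(empirical_terms),
)
print(query)
# (nudge OR "choice architecture") AND (method* OR empiric* OR procedure OR design)
```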

Given the exceptionally high heterogeneity in choice architecture research, we restricted our meta-analysis to studies that 1) empirically tested one or more choice architecture techniques using a randomized controlled experimental design, 2) had a behavioral outcome measure that was assessed in a real-life or hypothetical choice situation, 3) used individuals as the unit of analysis, and 4) were published in English. Studies that examined choice architecture in combination with other intervention measures, such as significant economic incentives or education programs, were excluded from our analyses to isolate the unique effects of choice architecture interventions on behavior.

The final sample comprised 447 effect sizes from 212 publications with a pooled sample size of 2,148,439 participants (n ranging from 14 to 813,990). SI Appendix, Fig. S1 illustrates the literature search and review process. All meta-analytic data and analyses reported in this paper are publicly available on the Open Science Framework (https://osf.io/fywae/) (74).

Effect Size Calculation and Moderator Coding. Due to the large variation in behavioral outcome measures, we calculated Cohen's d (40) as a standardized effect size measure of the mean difference between control and treatment conditions. Positive Cohen's d values were coded to reflect behavior change in the desired direction of the intervention, whereas negative values reflected an undesirable change in behavior.

To categorize systematic differences between choice architecture interventions, we coded studies for seven moderators describing the type of intervention, the behavioral domain in which it was implemented, and contextual study characteristics. The type of choice architecture intervention was classified using a taxonomy developed by Münscher and colleagues (13), which distinguishes three broad categories of choice architecture: decision information, decision structure, and decision assistance. Each of these categories targets a specific aspect of the choice environment, with decision information interventions targeting the way in which choice alternatives are described (e.g., framing), decision structure interventions targeting the way in which those choice alternatives are organized and structured (e.g., choice defaults), and decision assistance interventions targeting the way in which decisions can be reinforced (e.g., commitment devices). With its tripartite categorization framework, the taxonomy is able to capture and categorize the vast majority of choice architecture interventions described in the literature, making it one of the most comprehensive classification schemes of choice architecture techniques in the field (see Table 1 for an overview). Many alternative attempts to organize and structure choice architecture interventions are considered problematic because they combine descriptive categorization approaches, which classify interventions based on choice architecture technique, and explanatory categorization approaches, which classify interventions based on underlying psychological mechanisms, within a single framework. The taxonomy we use here adopts a descriptive categorization approach in that it organizes interventions exclusively in terms of choice architecture techniques. We chose this approach not only to omit common shortcomings of hybrid classification schemes, such as a reduction in the interpretability of results, but also to warrant a highly reliable categorization of interventions in the absence of psychological outcome measures that would allow us to infer explanatory mechanisms. Using a descriptive categorization approach further allowed us to generate theoretically meaningful insights that can be easily translated into concrete recommendations for policy making. Each intervention was coded according to its specific technique and corresponding category. Interventions that combined multiple choice architecture techniques were excluded from our analyses to isolate the unique effect of each approach. Based on previous reviews (73) and inspection of our data, we distinguished six behavioral domains: health, food, environment, finance, prosocial behavior, and other behavior. Contextual study characteristics included the type of experiment that had been conducted (conventional laboratory experiment, artifactual field experiment, framed field experiment, or natural field experiment), the location of the intervention (inside vs. outside of the United States), the target population of the intervention (adults vs. children and adolescents), and the year in which the data were published. Interrater reliability across a random sample of 20% of the publications was high, with Cohen's κ ranging from 0.76 to 1 (M = 0.87).

Statistical Analysis. We estimated the overall effect of choice architecture interventions using a three-level meta-analytic model with random effects on the treatment and the publication level. This approach allowed us to account for the hierarchical structure of our data due to publications that reported multiple relevant outcome variables and/or more than one experiment (75–77). To further account for dependency in sampling errors due to overlapping samples (e.g., in cases where multiple treatment conditions were compared to the same control condition), we computed cluster-robust SEs, confidence intervals, and statistical tests for the estimated effect sizes (78, 79).

To identify systematic differences between choice architecture interventions, we ran multiple moderator analyses in which we tested for the effects of type of intervention, behavioral domain, and study characteristics using mixed-effects meta-analytic models with random effects on the treatment and the publication level. All analyses were conducted in R using the package metafor (80).

Data Availability. Data have been deposited in the Open Science Framework (https://osf.io/fywae/).

ACKNOWLEDGMENTS. This research was supported by Swiss National Science Foundation Grant PYAPP1_160571 awarded to Tobias Brosch and Swiss Federal Office of Energy Grant SI/501597-01. It is part of the activities of the Swiss Competence Center for Energy Research – Competence Center for Research in Energy, Society and Transition, supported by the Swiss Innovation Agency (Innosuisse). The funding sources had no involvement in the preparation of the article; in the study design; in the collection, analysis, and interpretation of data; nor in the writing of the manuscript. We thank Allegra Mulas and Laura Pagel for their assistance in data collection and extraction.
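As an illustration of the computations described in Materials and Methods, the sketch below derives Cohen's d from group summary statistics and pools effects with a simple DerSimonian-Laird random-effects estimate. This is a deliberately simplified stand-in for the authors' three-level metafor model with cluster-robust errors; the variance formula for d is the usual large-sample approximation, and all function names are our own:

```python
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference between treatment and control,
    using the pooled standard deviation."""
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / sp

def d_variance(d, n_t, n_c):
    """Approximate sampling variance of Cohen's d."""
    return (n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c))

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects estimate of the mean effect."""
    w = [1.0 / v for v in variances]
    theta_fe = sum(wi * d for wi, d in zip(w, effects)) / sum(w)
    q = sum(wi * (d - theta_fe) ** 2 for wi, d in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0  # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    return sum(wi * d for wi, d in zip(w_re, effects)) / sum(w_re)

d = cohens_d(1.0, 2.0, 50, 0.0, 2.0, 50)
print(round(d, 3))  # 0.5
print(round(random_effects_pool([0.3, 0.5, 0.7], [0.02, 0.02, 0.02]), 3))  # 0.5
```

Note that the positive-sign convention mirrors the coding rule in the text: desirable behavior change yields a positive d.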

1. R. H. Thaler, C. R. Sunstein, Nudge: Improving Decisions about Health, Wealth, and Happiness (Yale University Press, 2008).
2. R. H. Thaler, C. R. Sunstein, J. P. Balz, "Choice architecture" in The Behavioral Foundations of Public Policy, E. Shafir, Ed. (Princeton University Press, 2013), pp. 428–439.
3. G. S. Becker, The Economic Approach to Human Behavior (University of Chicago Press, ed. 1, 1976).
4. I. Ajzen, The theory of planned behavior. Organ. Behav. Hum. Decis. Process. 50, 179–211 (1991).
5. P. C. Stern, Toward a coherent theory of environmentally significant behavior. J. Soc. Issues 56, 407–424 (2000).
6. D. Albarracin, S. Shavitt, Attitudes and attitude change. Annu. Rev. Psychol. 69, 299–327 (2018).
7. J. S. B. T. Evans, Dual-processing accounts of reasoning, judgment, and social cognition. Annu. Rev. Psychol. 59, 255–278 (2008).
8. H. A. Simon, A behavioral model of rational choice. Q. J. Econ. 69, 99–118 (1955).
9. H. A. Simon, Models of Bounded Rationality (MIT Press, 1982).
10. G. Gigerenzer, W. Gaissmaier, Heuristic decision making. Annu. Rev. Psychol. 62, 451–482 (2011).
11. S. Lichtenstein, P. Slovic, Eds., The Construction of Preference (Cambridge University Press, 2006).
12. J. W. Payne, J. R. Bettman, E. J. Johnson, Behavioral decision research: A constructive processing perspective. Annu. Rev. Psychol. 43, 87–131 (1992).
13. R. Münscher, M. Vetter, T. Scheuerle, A review and taxonomy of choice architecture techniques. J. Behav. Decis. Making 29, 511–524 (2016).
14. D. Kahneman, Thinking, Fast and Slow (Farrar, Straus and Giroux, 2011).
15. P. Slovic, From Shakespeare to Simon: Speculations—and some evidence—about man's ability to process information. Or. Res. Inst. Res. Bull. 12, 1–19 (1972).
16. C. K. Hsee, J. Zhang, General evaluability theory. Perspect. Psychol. Sci. 5, 343–355 (2010).
17. A. K. Shah, D. M. Oppenheimer, Easy does it: The role of fluency in cue weighting. Judgm. Decis. Mak. 2, 371–379 (2007).
18. H. Allcott, Social norms and energy conservation. J. Public Econ. 95, 1082–1095 (2011).
19. K. Jessoe, D. Rapson, Knowledge is (less) power: Experimental evidence from residential energy use. Am. Econ. Rev. 104, 1417–1438 (2014).
20. C. A. Roberto, P. D. Larsen, H. Agnew, J. Baik, K. D. Brownell, Evaluating the impact of menu labeling on food choices and intake. Am. J. Public Health 100, 312–318 (2010).
21. R. P. Larrick, J. B. Soll, Economics. The MPG illusion. Science 320, 1593–1594 (2008).
22. E. J. Johnson, D. Goldstein, Medicine. Do defaults save lives? Science 302, 1338–1339 (2003).
23. J. Maas, D. T. D. de Ridder, E. de Vet, J. B. F. de Wit, Do distant foods decrease intake? The effect of food accessibility on consumption. Psychol. Health 27 (suppl. 2), 59–73 (2012).
24. J. M. Martin, M. I. Norton, Shaping online consumer choice by partitioning the web. Psychol. Mark. 26, 908–926 (2009).
25. M. A. Sharif, S. B. Shu, Nudging persistence after failure through emergency reserves. Organ. Behav. Hum. Decis. Process. 163, 17–29 (2021).
26. P. Sheeran, T. L. Webb, The intention-behavior gap. Soc. Personal. Psychol. Compass 10, 503–518 (2016).
27. R. H. Thaler, S. Benartzi, Save more tomorrow: Using behavioral economics to increase employee saving. J. Polit. Econ. 112, 164–187 (2004).
28. C. Loibl, L. Jones, E. Haisley, Testing strategies to increase saving in individual development account programs. J. Econ. Psychol. 66, 45–63 (2018).
29. S. Benartzi et al., Should governments invest more in nudging? Psychol. Sci. 28, 1041–1055 (2017).

Mertens et al., The effectiveness of nudging: A meta-analysis of choice architecture interventions across behavioral domains. PNAS, https://doi.org/10.1073/pnas.2107346118
30. J. Beshears, H. Kosowsky, Nudging: Progress to date and future directions. Organ. Behav. Hum. Decis. Process. 161 (suppl.), 3–19 (2020).
31. S. DellaVigna, E. Linos, "RCTs to scale: Comprehensive evidence from two nudge units" (Working Paper 27594, National Bureau of Economic Research, 2020; https://www.nber.org/papers/w27594).
32. D. Hummel, A. Maedche, How effective is nudging? A quantitative review on the effect sizes and limits of empirical nudging studies. J. Behav. Exp. Econ. 80, 47–58 (2019).
33. J. M. Jachimowicz, S. Duncan, E. U. Weber, E. J. Johnson, When and why defaults influence decisions: A meta-analysis of default effects. Behav. Public Policy 3, 159–186 (2019).
34. A. N. Kluger, A. DeNisi, The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychol. Bull. 119, 254–284 (1996).
35. A. Kühberger, The influence of framing on risky decisions: A meta-analysis. Organ. Behav. Hum. Decis. Process. 75, 23–55 (1998).
36. W. Abrahamse, L. Steg, C. Vlek, T. Rothengatter, A review of intervention studies aimed at household energy conservation. J. Environ. Psychol. 25, 273–291 (2005).
37. R. Cadario, P. Chandon, Which healthy eating nudges work best? A meta-analysis of field experiments. Mark. Sci. 39, 459–486 (2020).
38. C. F. Nisa, J. J. Bélanger, B. M. Schumpe, D. G. Faller, Meta-analysis of randomised controlled trials testing behavioural interventions to promote household action on climate change. Nat. Commun. 10, 4545 (2019).
39. N. Zlatevska, C. Dubelaar, S. S. Holden, Sizing up the effect of portion size on consumption: A meta-analytic review. J. Mark. 78, 140–154 (2014).
40. J. Cohen, Statistical Power Analysis for the Behavioral Sciences (Lawrence Erlbaum Associates, 1988).
41. M. Borenstein, L. V. Hedges, J. P. Higgins, H. R. Rothstein, Identifying and Quantifying Heterogeneity (John Wiley & Sons, Ltd, 2009), pp. 107–125.
42. M. Borenstein, J. P. Higgins, L. V. Hedges, H. R. Rothstein, Basics of meta-analysis: I² is not an absolute measure of heterogeneity. Res. Synth. Methods 8, 5–18 (2017).
43. J. L. Vevea, C. M. Woods, Publication bias in research synthesis: Sensitivity analysis using a priori weight functions. Psychol. Methods 10, 428–443 (2005).
44. M. Egger, G. Davey Smith, M. Schneider, C. Minder, Bias in meta-analysis detected by a simple, graphical test. BMJ 315, 629–634 (1997).
45. G. W. Harrison, J. A. List, Field experiments. J. Econ. Lit. 42, 1009–1055 (2004).
46. A. Maki, R. J. Burns, L. Ha, A. J. Rothman, Paying people to protect the environment: A meta-analysis of financial incentive interventions to promote proenvironmental behaviors. J. Environ. Psychol. 47, 242–255 (2016).
47. E. Mantzari et al., Personal financial incentives for changing habitual health-related behaviors: A systematic review and meta-analysis. Prev. Med. 75, 75–85 (2015).
48. L. B. Snyder et al., A meta-analysis of the effect of mediated health communication campaigns on behavior change in the United States. J. Health Commun. 9 (suppl. 1), 71–96 (2004).
49. D. Hagmann, E. H. Ho, G. Loewenstein, Nudging out support for a carbon tax. Nat. Clim. Chang. 9, 484–489 (2019).
50. H. IJzerman et al., Use caution when applying behavioural science to policy. Nat. Hum. Behav. 4, 1092–1094 (2020).
51. A. S. Kristal, A. V. Whillans, What we can learn from five naturalistic field experiments that failed to shift commuter behaviour. Nat. Hum. Behav. 4, 169–176 (2020).
52. G. Loewenstein, N. Chater, Putting nudges in perspective. Behav. Public Policy 1, 26–53 (2017).
53. D. J. Hardisty, E. J. Johnson, E. U. Weber, A dirty word or a dirty world?: Attribute framing, political affiliation, and query theory. Psychol. Sci. 21, 86–92 (2010).
54. T. A. Homonoff, Can small incentives have large effects? The impact of taxes versus bonuses on disposable bag use. Am. Econ. J. Econ. Policy 10, 177–210 (2018).
55. E. J. McCaffery, J. Baron, Thinking about tax. Psychol. Public Policy Law 12, 106–135 (2006).
56. S. Mertens, U. J. J. Hahnel, T. Brosch, This way please: Uncovering the directional effects of attribute translations on decision making. Judgm. Decis. Mak. 15, 25–46 (2020).
57. I. Dinner, E. J. Johnson, D. G. Goldstein, K. Liu, Partitioning default effects: Why people choose not to choose. J. Exp. Psychol. Appl. 17, 332–341 (2011).
58. D. Knowles, K. Brown, S. Aldrovandi, Exploring the underpinning mechanisms of the proximity effect within a competitive food environment. Appetite 134, 94–102 (2019).
59. C. R. Sunstein, People prefer System 2 nudges (kind of). Duke Law J. 66, 121–168 (2016).
60. S. Campos, J. Doxey, D. Hammond, Nutrition labels on pre-packaged foods: A systematic review. Public Health Nutr. 14, 1496–1506 (2011).
61. T. M. Marteau, Framing of information: Its influence upon decisions of doctors and patients. Br. J. Soc. Psychol. 28, 89–94 (1989).
62. B. Verplanken, W. Wood, Interventions to break and create consumer habits. J. Public Policy Mark. 25, 90–103 (2006).
63. W. Wood, D. Rünger, Psychology of habit. Annu. Rev. Psychol. 67, 289–314 (2016).
64. T. A. G. Venema, F. M. Kroese, B. Verplanken, D. T. D. de Ridder, The (bitter) sweet taste of nudge effectiveness: The role of habits in a portion size nudge, a proof of concept study. Appetite 151, 104699 (2020).
65. H. Allcott, Site selection bias in program evaluation. Q. J. Econ. 130, 1117–1165 (2015).
66. C. Ghesla, M. Grieder, R. Schubert, Nudging the poor and the rich—A field study on the distributional effects of green electricity defaults. Energy Econ. 86, 104616 (2020).
67. K. Mrkva, N. A. Posner, C. Reeck, E. J. Johnson, Do nudges reduce disparities? Choice architecture compensates for low consumer knowledge. J. Mark. 85, 67–84 (2021).
68. C. J. Bryan, E. Tipton, D. S. Yeager, Behavioural science is unlikely to change the world without a heterogeneity revolution. Nat. Hum. Behav. 5, 980–989 (2021).
69. U. J. J. Hahnel, G. Chatelain, B. Conte, V. Piana, T. Brosch, Mental accounting mechanisms in energy decision-making and behaviour. Nat. Energy 5, 952–958 (2020).
70. C. R. Sunstein, The distributional effects of nudges. Nat. Hum. Behav. 10.1038/s41562-021-01236-z (2021).
71. A. P. Siddaway, A. M. Wood, L. V. Hedges, How to do a systematic review: A best practice guide for conducting and reporting narrative reviews, meta-analyses, and meta-syntheses. Annu. Rev. Psychol. 70, 747–770 (2019).
72. D. Moher, A. Liberati, J. Tetzlaff, D. G. Altman; PRISMA Group, Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Med. 6, e1000097 (2009).
73. B. Szaszi, A. Palinkas, B. Palfi, A. Szollosi, B. Aczel, A systematic scoping review of the choice architecture movement: Toward understanding when and why nudges work. J. Behav. Decis. Making 31, 355–366 (2018).
74. S. Mertens, M. Herberz, U. J. J. Hahnel, T. Brosch, The effectiveness of nudging: A meta-analysis of choice architecture interventions across behavioral domains. Open Science Framework. https://osf.io/fywae/. Deposited 11 September 2021.
75. M. W. L. Cheung, Modeling dependent effect sizes with three-level meta-analyses: A structural equation modeling approach. Psychol. Methods 19, 211–229 (2014).
76. W. Van den Noortgate, J. A. López-López, F. Marín-Martínez, J. Sánchez-Meca, Three-level meta-analysis of dependent effect sizes. Behav. Res. Methods 45, 576–594 (2013).
77. W. Van den Noortgate, J. A. López-López, F. Marín-Martínez, J. Sánchez-Meca, Meta-analysis of multiple outcomes: A multilevel approach. Behav. Res. Methods 47, 1274–1294 (2015).
78. A. C. Cameron, D. L. Miller, A practitioner's guide to cluster-robust inference. J. Hum. Resour. 50, 317–372 (2015).
79. L. V. Hedges, E. Tipton, M. C. Johnson, Robust variance estimation in meta-regression with dependent effect size estimates. Res. Synth. Methods 1, 39–65 (2010).
80. W. Viechtbauer, Conducting meta-analyses in R with the metafor package. J. Stat. Softw. 36, 1–48 (2010).
