The effectiveness of nudging: A meta-analysis of choice architecture interventions across behavioral domains
Edited by Susan Fiske, Psychology Department, Princeton University, Princeton, NJ; received April 27, 2021; accepted November 24, 2021
Over the past decade, choice architecture interventions or so-called nudges have received widespread attention from both researchers and policy makers. Built on insights from the behavioral sciences, this class of behavioral interventions focuses on the design of choice environments that facilitate personally and socially desirable decisions without restricting people in their freedom of choice. Drawing on more than 200 studies reporting over 440 effect sizes (n = 2,148,439), we present a comprehensive analysis of the effectiveness of choice architecture interventions across techniques, behavioral domains, and contextual study characteristics. Our results show that choice architecture interventions overall promote behavior change with a small to medium effect size of Cohen's d = 0.43 (95% CI [0.38, 0.48]). In addition, we find that the effectiveness of choice architecture interventions varies significantly as a function of technique and domain. Across behavioral domains, interventions that target the organization and structure of choice alternatives (decision structure) consistently outperform interventions that focus on the description of alternatives (decision information) or the reinforcement of behavioral intentions (decision assistance). Food choices are particularly responsive to choice architecture interventions, with effect sizes up to 2.5 times larger than those in other behavioral domains. Overall, choice architecture interventions affect behavior relatively independently of contextual study characteristics such as the geographical location or the target population of the intervention. Our analysis further reveals a moderate publication bias toward positive results in the literature. We end with a discussion of the implications of our findings for theory and behaviorally informed policy making.

choice architecture | nudge | behavioral insights | behavior change | meta-analysis

Significance

Changing individuals' behavior is key to tackling some of today's most pressing societal challenges such as the COVID-19 pandemic or climate change. Choice architecture interventions aim to nudge people toward personally and socially desirable behavior through the design of choice environments.

Fig. 1. Number of citations of Thaler and Sunstein (1) between 2008 and 2020. Counts are based on a citation search in Web of Science.

The rational agent model assumes that people base their decisions on known and consistent preferences that aim to maximize the utility, or value, of their actions. In determining their preferences, people are thought to engage in an exhaustive analysis of the probabilities and potential costs and benefits of all available options to identify which option provides the highest expected utility and is thus the most favorable (3). Interventions aiming to change behavior are accordingly designed to increase the utility of the desired option, either by educating people about the existing costs and benefits of a certain behavior or by creating entirely new incentive structures by means of subsidies, tax credits, fines, or similar economic measures. Likewise, traditional psychological intervention approaches explain behavior as the result of a deliberate decision making process that weighs and integrates internal representations of people's belief structures, values, attitudes, and norms (4, 5). Interventions accordingly focus on measures such as information campaigns that aim to shift behavior through changes in people's beliefs or attitudes (6).

Over the past years, intervention approaches informed by research in the behavioral sciences have emerged as a complement to rational agent-based approaches. They draw on an alternative model of decision making which acknowledges that people are bounded in their ability to make rational decisions. Rooted in dual-process theories of cognition and information processing
(7), this model recognizes that human behavior is not always driven by the elaborate and rational thought processes assumed by the rational agent model but instead often relies on automatic and computationally less intensive forms of decision making that allow people to navigate the demands of everyday life in the face of limited time, available information, and computational power (8, 9). Boundedly rational decision makers often construct their preferences ad hoc based on cognitive shortcuts and biases, which makes them susceptible to supposedly irrational contextual influences, such as the way in which information is presented or structured (10–12). This susceptibility to contextual factors, while seemingly detrimental to decision making, has been identified as a promising lever for behavior change because it offers the opportunity to influence people's decisions through simple changes in the so-called choice architecture that defines the physical, social, and psychological context in which decisions are made (2). Rather than relying on education or significant economic incentives, choice architecture interventions aim to guide people toward personally and socially desirable behavior by designing environments that anticipate and integrate people's limitations in decision making to facilitate access to decision-relevant information, support the evaluation and comparison of available choice alternatives, or reinforce previously formed behavioral intentions (13) (see Table 1 for an overview of intervention techniques based on choice architecture*).

*While alternative classification schemes of choice architecture interventions can be found in the literature, the taxonomy used in the present meta-analysis distinguishes itself through its comprehensiveness, which makes it a highly reliable categorization tool and allows for inferences of both theoretical and practical relevance.

Addressing Psychological Barriers through Choice Architecture

Unlike the assumption of the rational agent model, people rarely have access to all relevant information when making a decision. Instead, they tend to base their decisions on information that is directly available to them at the moment of the decision (14, 15) and to discount or even ignore information that is too complex or meaningless to them (16, 17). Choice architecture interventions based on the provision of decision information aim to facilitate access to decision-relevant information by increasing its availability, comprehensibility, and/or personal relevance to the decision maker. One way to achieve this is to provide social reference information that reduces the ambiguity of a situation and helps overcome uncertainty about appropriate behavioral responses. In a natural field experiment with more than 600,000 US households, for instance, Allcott (18) demonstrated the effectiveness of descriptive social norms in promoting energy conservation. Specifically, the study showed that households which regularly received a letter comparing their own energy consumption to that of similar neighbors reduced their consumption by an average of 2%. This effect was estimated to be equivalent to that of a short-term electricity price increase of 11 to 20%. Other examples of decision information interventions include measures that increase the visibility of otherwise covert information (e.g., feedback devices and nutrition labels; refs. 19, 20), or that translate existing descriptions of choice options into more comprehensible or relevant information (e.g., through simplifying or reframing information; ref. 21).

Not only do people have limited access to decision-relevant information, but they often refrain from engaging in the elaborate cost-benefit analyses assumed by the rational agent model to evaluate and compare the expected utility of all choice options. Instead, they use contextual cues about the way in which choice alternatives are organized and structured within the decision environment to inform their behavior. Choice architecture interventions built around changes in the decision structure utilize this context dependency to influence behavior through the arrangement of choice alternatives or the format of decision making. One of the most prominent examples of this intervention approach is the choice default, or the preselection of an option that is imposed if no active choice is made. In a study comparing organ donation policies across European countries, Johnson and Goldstein (22) demonstrated the impact of defaults on even highly consequential decisions, showing that in countries with presumed consent laws, which by default register individuals as organ donors, the rate of donor registrations was nearly 60 percentage points higher than in countries with explicit consent laws, which require individuals to formally agree to becoming an
organ donor. Other examples of decision structure interventions include changes in the effort related to choosing an option (23), the range or composition of options (24), and the consequences attached to options (25).

Even if people make a deliberate and potentially rational decision to change their behavior, limited attentional capacities and a lack of self-control may prevent this decision from actually translating into the desired actions, a phenomenon described as the intention–behavior gap (26). Choice architecture interventions that provide measures of decision assistance aim to bridge the intention–behavior gap by reinforcing self-regulation. One example of this intervention approach is the use of commitment devices, which are designed to strengthen self-control by removing psychological barriers such as procrastination and intertemporal discounting that often stand in the way of successful behavior change. Thaler and Benartzi (27) demonstrated the effectiveness of such commitment devices in a large-scale field study of the Save More Tomorrow program, showing that employees increased their average saving rates from 3.5 to 13.6% when committing in advance to allocating parts of their future salary increases toward retirement savings. If applied across the United States, this program was estimated to increase total annual retirement contributions by approximately $25 billion for each 1% increase in saving rates. Other examples of decision assistance interventions are reminders, which affect decision making by increasing the salience of the intended behavior (28).

The Present Meta-analysis

Despite the growing interest in choice architecture, only a few attempts have been made to quantitatively integrate the empirical evidence on its effectiveness as a behavior change tool (29–32). Previous studies have mostly been restricted to the analysis of a single choice architecture technique (33–35) or a specific behavioral domain (36–39), leaving important questions unanswered, including how effective choice architecture interventions overall are in changing behavior and whether there are systematic differences across choice architecture techniques and behavioral domains that so far may have remained undetected and that may offer new insights into the psychological mechanisms that drive choice architecture interventions.

The aim of the present meta-analysis was to address these questions by first quantifying the overall effect of choice architecture interventions on behavior and then providing a systematic comparison of choice architecture interventions across different techniques, behavioral domains, and contextual study characteristics to answer 1) whether some choice architecture techniques are more effective in changing behavior than others, 2) whether some behavioral domains are more receptive to the effects of choice architecture interventions than others, 3) whether choice architecture techniques differ in their effectiveness across varying behavioral domains, and finally, 4) whether the effectiveness of choice architecture interventions is impacted by contextual study characteristics such as the location or target population of the intervention. Drawing on an exhaustive literature search that yielded more than 200 published and unpublished studies, this comprehensive analysis presents important insights into the effects and potential boundary conditions of choice architecture interventions and provides an evidence-based guideline for selecting behaviorally informed intervention measures.

Results

Effect Size of Choice Architecture. Our meta-analysis of 447 effect sizes from 212 publications (n = 2,148,439) revealed a statistically significant effect of choice architecture interventions on behavior (Cohen's d = 0.43, 95% CI [0.38, 0.48], t(333) = 16.51, P < 0.001) (Fig. 2). Using conventional criteria, this effect can be classified as small to medium in size (40). The effect size was reliable across several robustness checks, including the removal of influential outliers, which marginally decreased the overall size of the effect but did not change its statistical significance (d = 0.41, 95% CI [0.37, 0.46], t(331) = 17.61, P < 0.001). Additional leave-one-out analyses at the individual effect size level and the publication level found the effect of choice architecture interventions to be robust to the exclusion of any one effect size and publication, with d ranging from 0.42 to 0.44 and all P < 0.001.

The total heterogeneity was estimated to be τ² = 0.16, indicating considerable variability in the effect size of choice architecture interventions. More specifically, the dispersion of effect sizes suggests that while the majority of choice architecture interventions can be expected to successfully promote the desired behavior change, the effects of individual interventions vary considerably around this average.
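To make the estimation approach concrete, the following R sketch shows how an overall effect of this kind can be obtained with the metafor package: a three-level random-effects model with effect sizes nested in publications, cluster-robust standard errors and tests, and the model-implied 95% prediction interval. It is a minimal illustration under assumed column names (yi, vi, publication, treatment, sample) and an assumed file name; it is not the analysis script used for the deposited data.

```r
# Minimal sketch of the overall meta-analytic model (assumed data structure).
library(metafor)

# dat: one row per effect size, with hypothetical columns
#   yi           standardized mean difference (Cohen's d)
#   vi           sampling variance of yi
#   publication  identifier of the publication
#   treatment    identifier of the treatment condition within a publication
#   sample       identifier of the independent sample (e.g., a shared control group)
dat <- read.csv("effect_sizes.csv")  # hypothetical file name

# Three-level random-effects model with random effects on the treatment and
# the publication level (cf. Materials and Methods).
res <- rma.mv(yi, vi,
              random = ~ 1 | publication/treatment,
              data   = dat,
              method = "REML")

# Cluster-robust standard errors, confidence intervals, and tests to account
# for dependencies among effect sizes that come from the same sample.
robust(res, cluster = dat$sample)

# Total heterogeneity (sum of the variance components) and the 95% prediction
# interval describing the expected dispersion of true intervention effects.
sum(res$sigma2)
predict(res, digits = 2)
```

Under the reported estimates (d = 0.43, τ² = 0.16), such a prediction interval spans roughly 0.43 ± 1.96 × √0.16, that is, approximately −0.35 to 1.21, which is one way to quantify the considerable variability noted above.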
Fig. 4. Forest plot of effect sizes across choice architecture intervention techniques (see Table 1 for a more detailed description of techniques). The position of squares on the x axis indicates the effect size of each respective intervention technique. Bars indicate the 95% confidence intervals of effect sizes. The size of squares is inversely proportional to the SE of effect sizes. Diamond shapes indicate the average effect size and confidence intervals of intervention categories. The solid line represents an effect size of Cohen's d = 0. The dotted line represents the overall effect size of choice architecture interventions, Cohen's d = 0.43, 95% CI [0.38, 0.48]. Identical letter superscripts indicate statistically significant (P < 0.05) pairwise comparisons.

The effect sizes of the decision information, the decision structure, and the decision assistance category were thus unlikely to be driven by a single intervention technique but rather representative of the entire set of techniques within those categories.

Behavioral domain. Following our analysis of the effectiveness of varying types of choice architecture interventions, we next focused on identifying potential differences among the behavioral domains in which interventions were implemented. As illustrated in Fig. 5, effect sizes varied quite substantially across domains, with Cohen's d ranging from 0.24 to 0.65. Our analysis confirmed that the effectiveness of interventions was moderated by domain, F(5, 327) = 3.64, P = 0.003. Specifically, it showed that choice architecture interventions, while generally effective in inducing behavior change across all six domains, had a particularly strong effect on behavior in the food domain, with d = 0.65 (95% CI [0.47, 0.83]). The smallest effects were observed in the financial domain. With an average intervention effect of d = 0.24 (95% CI [0.14, 0.35]), this domain was less receptive to choice architecture interventions than the other behavioral domains we investigated. Introducing behavioral domain as a moderator in our meta-analytic model marginally reduced the ratio of true to total heterogeneity among effect sizes from I² = 99.52% to I² = 99.40% (I²(3) = 91.95%; I²(2) = 7.44%; SI Appendix, Table S3).

Intervention category across behavioral domain. Comparing the effectiveness of decision information, decision structure, and decision assistance interventions across domains consistently showed interventions within the decision structure category to have the largest effect on behavior, with Cohen's d ranging from 0.33 to 0.78 (Fig. 5). This result suggests that the observed effect size differences between the three categories of choice architecture interventions were relatively stable and independent from the behavioral domain in which interventions were applied. Including the interaction of intervention category and behavioral domain

Fig. 5. Forest plot of effect sizes across categories of choice architecture interventions and behavioral domains. The position of squares on the x axis indicates the effect size of each intervention category within a behavioral domain. Bars indicate the 95% confidence intervals of effect sizes. The size of squares is inversely proportional to the SE of effect sizes. Diamond shapes indicate the overall effect size and confidence intervals of choice architecture interventions within a behavioral domain. The solid line represents an effect size of Cohen's d = 0. The dotted line represents the overall effect size of choice architecture interventions, Cohen's d = 0.43, 95% CI [0.38, 0.48]. Identical letter superscripts indicate statistically significant (P < 0.05) pairwise comparisons within a behavioral domain.

Values shown in Fig. 5 (Cohen's d [95% CI]):
Food: Decision information 0.44 [0.19, 0.70]; Decision structure (b) 0.78 [0.54, 1.01]; Decision assistance (b) 0.43 [0.28, 0.59]; Average effect for domain (f,g,h,i) 0.65 [0.47, 0.83]
Environment: Decision information 0.40 [0.22, 0.58]; Decision structure (c) 0.52 [0.37, 0.68]; Decision assistance (c) 0.25 [0.06, 0.43]; Average effect for domain (g,j) 0.43 [0.33, 0.54]
Finance: Decision information 0.23 [0.13, 0.33]; Decision structure 0.33 [0.20, 0.46]; Decision assistance 0.21 [0.10, 0.33]; Average effect for domain (h,j) 0.24 [0.14, 0.35]
Pro-social: Decision information 0.37 [0.23, 0.50]; Decision structure (d) 0.48 [0.31, 0.66]; Decision assistance (d) 0.21 [0.13, 0.30]; Average effect for domain (i) 0.41 [0.27, 0.54]
Other: Decision information 0.27 [0.20, 0.35]; Decision structure (e) 0.41 [0.16, 0.66]; Decision assistance (e) 0.20 [0.09, 0.31]; Average effect for domain 0.31 [0.09, 0.52]
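To illustrate how moderation of this kind can be tested, the sketch below extends the three-level model from the previous code example with categorical and continuous moderators using metafor; domain, category, and year are hypothetical column names, and the coding of moderators in the deposited data may differ.

```r
# Sketch of the moderator analyses (assumed columns: domain, category, year).
library(metafor)

# Behavioral domain as a categorical moderator of the three-level model.
res_domain <- rma.mv(yi, vi,
                     mods   = ~ factor(domain),
                     random = ~ 1 | publication/treatment,
                     data   = dat)
robust(res_domain, cluster = dat$sample)  # cluster-robust omnibus (F) test

# Interaction of intervention category and behavioral domain.
res_int <- rma.mv(yi, vi,
                  mods   = ~ factor(category) * factor(domain),
                  random = ~ 1 | publication/treatment,
                  data   = dat)
robust(res_int, cluster = dat$sample)

# Contextual study characteristics can be added in the same way, e.g.,
# publication year as a mean-centered continuous moderator.
res_year <- rma.mv(yi, vi,
                   mods   = ~ scale(year, scale = FALSE),
                   random = ~ 1 | publication/treatment,
                   data   = dat)
robust(res_year, cluster = dat$sample)
```

Domain-specific summary effects such as those shown in Fig. 5 can then be obtained from the fitted moderator model, for example via predict() with the corresponding rows of the design matrix.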
Overall, choice architecture interventions appear to affect behavior relatively independently of contextual influences since neither location nor target population had a statistically significant impact on the effect size of interventions. In support of the external validity of behavioral measures, our analysis moreover did not find any difference in the effect size of different types of experiments. Only year of publication predicted the effect of interventions on behavior, with more recent publications reporting smaller effect sizes than older publications.

Discussion

Changing individuals' behavior is key to solving some of today's most pressing societal challenges. However, how can this behavior change be achieved? Recently, more and more researchers and policy makers have approached this question through the use of choice architecture interventions. The present meta-analysis integrates over a decade's worth of research to shed light on the effectiveness of choice architecture and the conditions under which it facilitates behavior change. Our results show that choice architecture interventions promote behavior change with a small to medium effect size of Cohen's d = 0.43, which is comparable to more traditional intervention approaches like education campaigns or financial incentives (46–48). Our findings are largely consistent with those of previous analyses that investigated the effectiveness of choice architecture interventions in a smaller subset of the literature (e.g., refs. 29, 30, 32, 33). In their recent meta-analysis of choice architecture interventions across academic disciplines, Beshears and Kosowsky (30), for example, found that choice architecture interventions had an average effect size of d = 0.41. Similarly, focusing on one choice architecture technique only, Jachimowicz et al. (33) found that choice defaults had an average effect size of d = 0.68, which is slightly higher than the effect size our analysis revealed for this intervention technique (d = 0.62). Our results suggest a somewhat higher overall effectiveness of choice architecture interventions than meta-analyses that have focused exclusively on field experimental research (31, 37), a discrepancy that holds even when accounting for differences between experimental settings (45). This inconsistency in findings may in part be explained by differences in meta-analytic samples. Only 7% of the studies analyzed by DellaVigna and Linos (31), for example,
Choice architecture is not restricted to stand-alone interventions but extends to hybrid policy measures that use choice architecture as a complement to more traditional intervention approaches (52). Previous research, for example, has shown that the impact of economic interventions such as taxes or financial incentives can be enhanced through choice architecture (53–55).

In addition to the overall effect size of choice architecture interventions, our systematic comparison of interventions across different techniques, behavioral domains, and contextual study characteristics reveals substantial variations in the effectiveness of choice architecture as a behavior change tool. Most notably, we find that across behavioral domains, decision structure interventions that modify decision environments to address decision makers' limited capacity to evaluate and compare choice options are consistently more effective in changing behavior than decision information interventions that address decision makers' limited access to decision-relevant information or decision assistance interventions that address decision makers' limited attention and self-control. This relative advantage of structural choice architecture techniques may be due to the specific psychological mechanisms that underlie the different intervention techniques or, more specifically, their demands on information processing. Decision information and decision assistance interventions rely on relatively elaborate forms of information processing in that the information and assistance they provide needs to be encoded and evaluated in terms of personal values and/or goals to determine the overall utility of a given choice option (56). Decision structure interventions, by contrast, often do not require this type of information processing but provide a general utility boost for specific choice options that offers a cognitive shortcut for determining the most desirable option (57, 58). Accordingly, decision information and decision assistance interventions have previously been described as attempts to facilitate more deliberate decision making processes, whereas decision structure interventions have been characterized as attempts to advance more automatic decision making processes (59). Decision information and decision assistance interventions may thus more frequently fail to induce behavior change and show overall smaller effect sizes than decision structure interventions because they may exceed people's cognitive limits in decision making more often, especially in situations of high cognitive load or time pressure.

The engagement of internal value and goal representations by decision information and decision assistance interventions

2) lower susceptibility to individual differences in values and goals. Our explanation remains somewhat speculative, however, as empirical research especially on the cognitive processes underlying choice architecture interventions is still relatively scarce (but see refs. 53, 56, 57). More research efforts are needed to clarify the psychological mechanisms that drive the impact of choice architecture interventions and determine their effectiveness in changing behavior.

Besides the effect size variations between different categories of choice architecture techniques, our results reveal considerable differences in the effectiveness of choice architecture interventions across behavioral domains. Specifically, we find that choice architecture interventions had a particularly strong effect on behavior in the food domain, with average effect sizes up to 2.5 times larger than those in the health, environmental, financial, prosocial, or other behavioral domain.† A key characteristic of food choices and other food-related behaviors is the fact that they bear relatively low behavioral costs and few, if any, perceived long-term consequences for the decision maker. Previous research has found that the potential impact of a decision can indeed moderate the effectiveness of choice architecture interventions, with techniques such as gain and loss framing having a smaller effect on behavior when the decision at hand has a high, direct impact on the decision maker than when the decision has little to no impact (61). Consistent with this research, we observe not only the largest effect sizes of choice architecture interventions in the food domain but also the overall smallest effect sizes of interventions in the financial domain, a domain that predominantly represents decisions of high impact to the decision maker. This systematic variation of effect sizes across behavioral domains suggests that when making decisions that are perceived to have a substantial impact on their lives, people may be less prone to the influence of automatic biases and heuristics, and thus the effects of choice architecture interventions, than when making decisions of comparatively smaller impact.

†Please note that our results are robust to the exclusion of nonretracted studies by the Cornell Food and Brand Laboratory, which has been criticized for repeated scientific misconduct; retracted studies by this research group were excluded from the meta-analysis.

Another characteristic of food choices that may explain the high effectiveness of choice architecture interventions in the food domain
Choice architecture interventions are an effective tool for changing instances of habitualized behaviors (64). This finding is particularly relevant from a policy making perspective as habits tend to be relatively unresponsive to traditional intervention approaches and are therefore generally considered to be difficult to change (62). Given that choice architecture interventions can only target the environmental cues that trigger habitualized responses but not the association between choice environment and behavior per se, it should be noted though that the effects of interventions are likely limited to the specific choice contexts in which they are implemented.

While the present meta-analysis provides a comprehensive overview of the effectiveness of choice architecture as a behavior change tool, more research is needed to complement and complete our findings. For example, our methodological focus on individuals as the unit of analysis excludes a large number of studies that have investigated choice architecture interventions on broader levels, such as households, school classes, or organizations, which may reduce the generalizability of our results. Future research should target these studies specifically to add to the current analysis. Similarly, our data show very high levels of heterogeneity among the effect sizes of choice architecture interventions. Although the type of intervention, the behavioral domain in which it is applied, and contextual study characteristics account for some of this heterogeneity (SI Appendix, Table S3), more research is needed to identify factors that may explain the variability in effect sizes above and beyond those investigated here. Research has recently started to reveal some of those potential moderators of choice architecture interventions, including sociodemographic factors such as income and socioeconomic status as well as psychological factors such as domain knowledge, numerical ability, and attitudes (65–67). Investigating these moderators systematically can not only provide a more nuanced understanding of the conditions under which choice architecture facilitates behavior change but may also help to inform the design and implementation of targeted interventions that take into account individual differences in the susceptibility to choice architecture interventions (68). Ethical considerations should play a prominent role in this process to ensure that potentially more susceptible populations, such as children or low-income households, retain their ability to make decisions that are in their personal best interest (66, 69, 70). Based on the results of our own moderator analyses, additional avenues for future research may include the study of how information processing influences the effectiveness of varying types of choice architecture interventions

Few approaches to behavior change have received as much attention from researchers and policy makers as choice architecture interventions. Integrating the results of more than 440 behavioral interventions, the present meta-analysis finds that choice architecture is an effective and widely applicable behavior change tool that facilitates personally and socially desirable choices across behavioral domains, geographical locations, and populations. Our results provide insights into the overall effectiveness of choice architecture interventions as well as systematic effect size variations among them, revealing promising directions for future research that may facilitate the development of theories in this still new but fast-growing field of research. Our work also provides a comprehensive overview of the effectiveness of choice architecture interventions across a wide range of intervention contexts that are representative of some of the most pressing societal challenges we are currently facing. This overview can serve as a guideline for policy makers who seek reliable, evidence-based information on the potential impact of choice architecture interventions and the conditions under which they promote behavior change.

Materials and Methods

The meta-analysis was conducted in accordance with guidelines for conducting systematic reviews (71) and conforms to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (72) standards.

Literature Search and Inclusion Criteria. We searched the electronic databases PsycINFO, PubMed, PubPsych, and ScienceDirect using a combination of keywords associated with choice architecture (nudge OR "choice architecture") and empirical research (method* OR empiric* OR procedure OR design).‡ Since the terms nudge and choice architecture were established only after the seminal book by Thaler and Sunstein (1), we restricted this search to studies that were published no earlier than 2008. To compensate for the potential bias this temporal restriction might introduce to the results of our meta-analysis, we identified additional studies, including studies published before 2008, through the reference lists of relevant review articles and a search for research reports by governmental and nongovernmental behavioral science units. To reduce the possibly confounding effects of publication status on the estimation of effect sizes, we further searched for unpublished studies using the ProQuest Dissertations & Theses database and requesting unpublished data through academic mailing lists. The search concluded in June 2019, yielding a total of 9,606 unique publications.

‡Search terms were adapted from Szaszi et al. (73).
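To make the search logic explicit, the short R sketch below assembles the Boolean query implied by the keywords reported above; it is an illustration only, and the exact query syntax submitted to PsycINFO, PubMed, PubPsych, and ScienceDirect would have to be adapted to each database.

```r
# Sketch of the Boolean search string built from the reported keyword sets.
nudge_terms     <- c("nudge", "\"choice architecture\"")
empirical_terms <- c("method*", "empiric*", "procedure", "design")

query <- paste0("(", paste(nudge_terms, collapse = " OR "), ") AND (",
                paste(empirical_terms, collapse = " OR "), ")")
cat(query)
#> (nudge OR "choice architecture") AND (method* OR empiric* OR procedure OR design)

# The reported temporal restriction (publication year >= 2008) would be applied
# as a database filter rather than as part of the keyword string.
```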
Each choice architecture intervention was classified using a taxonomy developed by Münscher and colleagues (13), which distinguishes three broad categories of choice architecture: decision information, decision structure, and decision assistance. Each of these categories targets a specific aspect of the choice environment, with decision information interventions targeting the way in which choice alternatives are described (e.g., framing), decision structure interventions targeting the way in which those choice alternatives are organized and structured (e.g., choice defaults), and decision assistance interventions targeting the way in which decisions can be reinforced (e.g., commitment devices). With its tripartite categorization framework, the taxonomy is able to capture and categorize the vast majority of choice architecture interventions described in the literature, making it one of the most comprehensive classification schemes of choice architecture techniques in the field (see Table 1 for an overview). Many alternative attempts to organize and structure choice architecture interventions are considered problematic because they combine descriptive categorization approaches, which classify interventions based on choice architecture technique, and explanatory categorization approaches, which classify interventions based on underlying psychological mechanisms, within a single framework. The taxonomy we use here adopts a descriptive categorization approach in that it organizes interventions exclusively in terms of choice architecture techniques. We chose this approach to not only omit common shortcomings of hybrid classification schemes, such as

To account for statistical dependencies among effect sizes due to overlapping samples (e.g., in cases where multiple treatment conditions were compared to the same control condition), we computed cluster-robust SEs, confidence intervals, and statistical tests for the estimated effect sizes (78, 79).

To identify systematic differences between choice architecture interventions, we ran multiple moderator analyses in which we tested for the effects of type of intervention, behavioral domain, and study characteristics using mixed-effects meta-analytic models with random effects on the treatment and the publication level. All analyses were conducted in R using the package metafor (80).
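To connect the reported effect sizes (Cohen's d) with the models sketched earlier, the snippet below shows one way standardized mean differences and their sampling variances can be computed from group-level summary statistics with metafor's escalc(); the data frame raw and its columns are hypothetical placeholders, and the computation rules applied to the deposited data may differ (e.g., for binary or count outcomes).

```r
# Sketch: computing standardized mean differences (bias-corrected SMD, i.e.,
# Hedges' g) and their sampling variances from assumed summary statistics.
library(metafor)

dat <- escalc(measure = "SMD",
              m1i = m_treat, sd1i = sd_treat, n1i = n_treat,   # treatment group
              m2i = m_ctrl,  sd2i = sd_ctrl,  n2i = n_ctrl,    # control group
              data = raw)

# Effect sizes that share a control condition are statistically dependent;
# as described in the text above, this dependency is accounted for via
# cluster-robust standard errors and the multilevel random-effects structure.
head(dat[, c("yi", "vi")])
```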
Data Availability. Data have been deposited in the Open Science Framework (https://osf.io/fywae/).

ACKNOWLEDGMENTS. This research was supported by Swiss National Science Foundation Grant PYAPP1_160571 awarded to Tobias Brosch and Swiss Federal Office of Energy Grant SI/501597-01. It is part of the activities of the Swiss Competence Center for Energy Research – Competence Center for Research in Energy, Society and Transition, supported by the Swiss Innovation Agency (Innosuisse). The funding sources had no involvement in the preparation of the article; in the study design; in the collection, analysis, and interpretation of data; nor in the writing of the manuscript. We thank Allegra Mulas and Laura Pagel for their assistance in data collection and extraction.
1. R. H. Thaler, C. R. Sunstein, Nudge: Improving Decisions about Health, Wealth, and Happiness (Yale University Press, 2008).
2. R. H. Thaler, C. R. Sunstein, J. P. Balz, "Choice architecture" in The Behavioral Foundations of Public Policy, E. Shafir, Ed. (Princeton University Press, 2013), pp. 428–439.
3. G. S. Becker, The Economic Approach to Human Behavior (University of Chicago Press, ed. 1, 1976).
4. I. Ajzen, The theory of planned behavior. Organ. Behav. Hum. Decis. Process. 50, 179–211 (1991).
5. P. C. Stern, Toward a coherent theory of environmentally significant behavior. J. Soc. Issues 56, 407–424 (2000).
6. D. Albarracin, S. Shavitt, Attitudes and attitude change. Annu. Rev. Psychol. 69, 299–327 (2018).
7. J. S. B. T. Evans, Dual-processing accounts of reasoning, judgment, and social cognition. Annu. Rev. Psychol. 59, 255–278 (2008).
8. H. A. Simon, A behavioral model of rational choice. Q. J. Econ. 69, 99–118 (1955).
9. H. A. Simon, Models of Bounded Rationality (MIT Press, 1982).
10. G. Gigerenzer, W. Gaissmaier, Heuristic decision making. Annu. Rev. Psychol. 62, 451–482 (2011).
11. S. Lichtenstein, P. Slovic, Eds., The Construction of Preference (Cambridge University Press, 2006).
12. J. W. Payne, J. R. Bettman, E. J. Johnson, Behavioral decision research: A constructive processing perspective. Annu. Rev. Psychol. 43, 87–131 (1992).
13. R. Münscher, M. Vetter, T. Scheuerle, A review and taxonomy of choice architecture techniques. J. Behav. Decis. Making 29, 511–524 (2016).
14. D. Kahneman, Thinking, Fast and Slow (Farrar, Straus and Giroux, 2011).
15. P. Slovic, From Shakespeare to Simon: Speculations—and some evidence—about man's ability to process information. Or. Res. Inst. Res. Bull. 12, 1–19 (1972).
16. C. K. Hsee, J. Zhang, General evaluability theory. Perspect. Psychol. Sci. 5, 343–355 (2010).
17. A. K. Shah, D. M. Oppenheimer, Easy does it: The role of fluency in cue weighting. Judgm. Decis. Mak. 2, 371–379 (2007).
18. H. Allcott, Social norms and energy conservation. J. Public Econ. 95, 1082–1095 (2011).
19. K. Jessoe, D. Rapson, Knowledge is (less) power: Experimental evidence from residential energy use. Am. Econ. Rev. 104, 1417–1438 (2014).
20. C. A. Roberto, P. D. Larsen, H. Agnew, J. Baik, K. D. Brownell, Evaluating the impact of menu labeling on food choices and intake. Am. J. Public Health 100, 312–318 (2010).
21. R. P. Larrick, J. B. Soll, Economics. The MPG illusion. Science 320, 1593–1594 (2008).
22. E. J. Johnson, D. Goldstein, Medicine. Do defaults save lives? Science 302, 1338–1339 (2003).
23. J. Maas, D. T. D. de Ridder, E. de Vet, J. B. F. de Wit, Do distant foods decrease intake? The effect of food accessibility on consumption. Psychol. Health 27 (suppl. 2), 59–73 (2012).
24. J. M. Martin, M. I. Norton, Shaping online consumer choice by partitioning the web. Psychol. Mark. 26, 908–926 (2009).
25. M. A. Sharif, S. B. Shu, Nudging persistence after failure through emergency reserves. Organ. Behav. Hum. Decis. Process. 163, 17–29 (2021).
26. P. Sheeran, T. L. Webb, The intention-behavior gap. Soc. Personal. Psychol. Compass 10, 503–518 (2016).
27. R. H. Thaler, S. Benartzi, Save more tomorrow: Using behavioral economics to increase employee saving. J. Polit. Econ. 112, 164–187 (2004).
28. C. Loibl, L. Jones, E. Haisley, Testing strategies to increase saving in individual development account programs. J. Econ. Psychol. 66, 45–63 (2018).
29. S. Benartzi et al., Should governments invest more in nudging? Psychol. Sci. 28, 1041–1055 (2017).
42. M. Borenstein, J. P. Higgins, L. V. Hedges, H. R. Rothstein, Basics of meta-analysis: I² is not an absolute measure of heterogeneity. Res. Synth. Methods 8, 5–18 (2017).
43. J. L. Vevea, C. M. Woods, Publication bias in research synthesis: Sensitivity analysis using a priori weight functions. Psychol. Methods 10, 428–443 (2005).
44. M. Egger, G. Davey Smith, M. Schneider, C. Minder, Bias in meta-analysis detected by a simple, graphical test. BMJ 315, 629–634 (1997).
45. G. W. Harrison, J. A. List, Field experiments. J. Econ. Lit. 42, 1009–1055 (2004).
46. A. Maki, R. J. Burns, L. Ha, A. J. Rothman, Paying people to protect the environment: A meta-analysis of financial incentive interventions to promote proenvironmental behaviors. J. Environ. Psychol. 47, 242–255 (2016).
47. E. Mantzari et al., Personal financial incentives for changing habitual health-related behaviors: A systematic review and meta-analysis. Prev. Med. 75, 75–85 (2015).
48. L. B. Snyder et al., A meta-analysis of the effect of mediated health communication campaigns on behavior change in the United States. J. Health Commun. 9 (suppl. 1), 71–96 (2004).
49. D. Hagmann, E. H. Ho, G. Loewenstein, Nudging out support for a carbon tax. Nat. Clim. Chang. 9, 484–489 (2019).
50. H. IJzerman et al., Use caution when applying behavioural science to policy. Nat. Hum. Behav. 4, 1092–1094 (2020).
51. A. S. Kristal, A. V. Whillans, What we can learn from five naturalistic field experiments that failed to shift commuter behaviour. Nat. Hum. Behav. 4, 169–176 (2020).
52. G. Loewenstein, N. Chater, Putting nudges in perspective. Behav. Public Policy 1, 26–53 (2017).
53. D. J. Hardisty, E. J. Johnson, E. U. Weber, A dirty word or a dirty world?: Attribute framing, political affiliation, and query theory. Psychol. Sci. 21, 86–92 (2010).
54. T. A. Homonoff, Can small incentives have large effects? The impact of taxes versus bonuses on disposable bag use. Am. Econ. J. Econ. Policy 10, 177–210 (2018).
55. E. J. McCaffery, J. Baron, Thinking about tax. Psychol. Public Policy Law 12, 106–135 (2006).
56. S. Mertens, U. J. J. Hahnel, T. Brosch, This way please: Uncovering the directional effects of attribute translations on decision making. Judgm. Decis. Mak. 15, 25–46 (2020).
69. … mechanisms in energy decision-making and behaviour. Nat. Energy 5, 952–958 (2020).
70. C. R. Sunstein, The distributional effects of nudges. Nat. Hum. Behav. 10.1038/s41562-021-01236-z (2021).
71. A. P. Siddaway, A. M. Wood, L. V. Hedges, How to do a systematic review: A best practice guide for conducting and reporting narrative reviews, meta-analyses, and meta-syntheses. Annu. Rev. Psychol. 70, 747–770 (2019).
72. D. Moher, A. Liberati, J. Tetzlaff, D. G. Altman; PRISMA Group, Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Med. 6, e1000097 (2009).
73. B. Szaszi, A. Palinkas, B. Palfi, A. Szollosi, B. Aczel, A systematic scoping review of the choice architecture movement: Toward understanding when and why nudges work. J. Behav. Decis. Making 31, 355–366 (2018).
74. S. Mertens, M. Herberz, U. J. J. Hahnel, T. Brosch, The effectiveness of nudging: A meta-analysis of choice architecture interventions across behavioral domains. Open Science Framework. https://osf.io/fywae/. Deposited 11 September 2021.
75. M. W. L. Cheung, Modeling dependent effect sizes with three-level meta-analyses: A structural equation modeling approach. Psychol. Methods 19, 211–229 (2014).
76. W. Van den Noortgate, J. A. López-López, F. Marín-Martínez, J. Sánchez-Meca, Three-level meta-analysis of dependent effect sizes. Behav. Res. Methods 45, 576–594 (2013).
77. W. Van den Noortgate, J. A. López-López, F. Marín-Martínez, J. Sánchez-Meca, Meta-analysis of multiple outcomes: A multilevel approach. Behav. Res. Methods 47, 1274–1294 (2015).
78. A. C. Cameron, D. L. Miller, A practitioner's guide to cluster-robust inference. J. Hum. Resour. 50, 317–372 (2015).
79. L. V. Hedges, E. Tipton, M. C. Johnson, Robust variance estimation in meta-regression with dependent effect size estimates. Res. Synth. Methods 1, 39–65 (2010).
80. W. Viechtbauer, Conducting meta-analyses in R with the metafor package. J. Stat. Softw. 36, 1–48 (2010).