
Received: 3 November 2020 Revised: 19 May 2021 Accepted: 23 May 2021

DOI: 10.1111/jcal.12581

REVIEW ARTICLE

Digital self-control interventions for distracting media multitasking - A systematic review

Daniel Biedermann1 | Jan Schneider1 | Hendrik Drachsler1,2,3

1 Information Center for Education, DIPF | Leibniz Institute for Research and Information in Education, Frankfurt am Main, Germany
2 Educational Science Faculty, Open University of the Netherlands, Heerlen, The Netherlands
3 Computer Science Faculty, Goethe University, Frankfurt am Main, Germany

Correspondence
Daniel Biedermann, DIPF | Leibniz-Institute for Research and Information in Education, Rostocker Straße 6, 60323 Frankfurt am Main, Germany.
Email: [email protected]

Funding information
Leibniz-Kooperative Exzellenz

Abstract
Digital distractions can interfere with goal attainment and lead to undesirable habits that are hard to get rid of. Various digital self-control interventions promise support to alleviate the negative impact of digital distractions. These interventions use different approaches, such as the blocking of apps and websites, goal setting, or visualizations of device usage statistics. While many apps and browser extensions make use of these features, little is known about their effectiveness. This systematic review synthesizes the current research to provide insights into the effectiveness of the different kinds of interventions. From a search of the ‘ACM’, ‘Springer Link’, ‘Web of Science’, ‘IEEE Xplore’ and ‘Pubmed’ databases, we identified 28 digital self-control interventions. We categorized these interventions according to their features and their outcomes. The interventions showed varying degrees of effectiveness, and especially interventions that relied purely on increasing the participants' awareness were barely effective. For those interventions that sanctioned the use of distractions, the current literature indicates that the sanctions have to be sufficiently difficult to overcome, as they will otherwise be quickly dismissed. The overall confidence in the results is low, with small sample sizes, short study durations, and unclear study contexts. From these insights, we highlight research gaps and close with suggestions for future research.

KEYWORDS
digital distractions, media multitasking, self-control, self-regulation

This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
© 2021 The Authors. Journal of Computer Assisted Learning published by John Wiley & Sons Ltd.
J Comput Assist Learn. 2021;37:1217–1231. wileyonlinelibrary.com/journal/jcal

1 | INTRODUCTION

In learning scenarios such as blended learning or online courses, where learning is largely self-directed, digital content has a Janus-faced role. On the one hand are the potential benefits, such as the access to knowledge resources and the opportunity to exchange information quickly. On the other hand, there are many ways that learners can be distracted from their learning goals, for instance by watching entertaining videos, browsing social networks, or playing video games. Both the goal-congruent and the goal-incongruent content are often equally accessible. However, while learning can be unpleasant and exhausting, the distracting entertainment is engineered to be fun and addicting, to tempt the users to spend as much time on a platform as possible (Eyal, 2014).

Awareness of the detrimental effects of digital distractions is widespread. When students were asked about their smartphone usage, more than 60% responded that they felt they were overusing their smartphone and that it was a distraction in class (Ko et al., 2015). In an observation of learner behaviour, Rosen et al. (2013) observed that participants switched tasks on average every 6 min, mostly to social media.
They also observed negative associations between the frequency of switching to social media and the GPA. This negative association between the frequency of using digital distractions during learning and academic performance was observed in several other studies (Kirschner & Karpinski, 2010; Masood et al., 2020).

Different disciplines use different terminology to describe the phenomenon of overuse of distracting content: The term ‘media multitasking’ is often used in the psychological research literature, and there are validated surveys to measure it (e.g., Baumgartner et al., 2017). While studies find overall that media multitasking is likely to be detrimental to academic performance (van der Schuur et al., 2015), there is still debate about positive effects of media multitasking. Another term that is often used to describe distractions at work is cyberloafing (Varol & Yıldırım, 2019). The term cyberloafing describes behaviour that is undesirable, especially from the perspective of third parties such as employers. Whether the behaviour referred to as cyberloafing is bad for the people themselves is, of course, another question. Meanwhile, in the human-computer interaction (HCI) literature, a broad range of terms like ‘smartphone non-use’ (Hiniker et al., 2016), ‘self-interruption’ (Kim, Cho, et al., 2017), or ‘self-control failure’ (Lyngs et al., 2020) is used. Self-control has a clear construct and a model that further sheds light on the mechanisms behind frequent use of distracting content. Self-control intervention is also the term that is used for the digital interventions that are the subject of this review (Lyngs et al., 2019; Schwartz et al., 2021).

2 | THEORY AND RELATED WORKS

If and how often someone follows a short-term gratifying impulse, instead of working towards their long-term goals, depends on several factors that are captured in the trait of self-control. Since this trait is relatively stable (Coyne & Wright, 2014) and even partially hereditary (Willems et al., 2019), some people are more susceptible to disrupting their learning than others.

Another factor that influences the outcome of self-control conflicts is habitual behaviour (Duckworth et al., 2019), which refers to behaviour that has reached a degree of automaticity such that it is no longer directed by conscious goals. It develops through frequent repetition of the same behaviour in a stable context (Fiorella, 2020; Mazar & Wood, 2018; Wood & Rünger, 2016). Future behaviour is then initiated by exposure to the same contextual cues, which can be locations, preceding actions, or mental states (Mazar & Wood, 2018; Wood & Rünger, 2016). This is especially challenging given the portable and ubiquitous nature of digital devices. Individuals can develop habitual behaviour in a wide range of contexts (Bayer & LaRose, 2018), and it is not unreasonable to expect that many disruptions of learning are due to the habitual use of digital devices. The often-habitual nature of using digital devices can also explain why especially ‘heavy users’ severely underestimate the amount of time they spend with digital distractions (H. Lee et al., 2014).

For both the conscious and the not-so-conscious uses of distractions, digital self-control interventions aim to provide support where the individual self-control is no longer sufficient. Lyngs et al. (2019) have conducted a review of consumer-oriented digital self-control interventions from the various app stores. In this review, the authors created an integrative model of self-control and habitual behaviour that aims to explain how these interventions can influence behaviour in the face of distractions (see Figure 1).

As a result of their work, Lyngs et al. (2019) identified 367 different interventions that they sorted into four broad categories, which reflect different targets where these interventions influence the decision to (not) use a distraction. Their categories are: (1) blocking/removing, (2) self-tracking, (3) goal advancement and (4) reward/punishment.

The interventions with features in the category blocking or removing block the access to programs or websites entirely, or they set limits on the duration that a user can access distracting content. In some cases, the user can pause the blocking with additional friction, like a menial task or a password. Other interventions in this category remove particularly distracting features, such as a newsfeed or content recommendations, from popular websites. These interventions can prevent habitual use of digital distractions, giving users the chance to reflect on their behaviour and make a conscious decision instead of following an unconscious impulse.

If the behaviour is driven by conscious decisions, the users can benefit from interventions with features from the self-tracking and goal advancement category. Self-tracking interventions display usage statistics or the amount of time that a user spent on distracting activities. Users are supposed to monitor themselves and to decide when they have overused digital content. Goal setting interventions remind the users of a task or a goal that they have set for themselves.

Lastly, the category reward/punishment contains interventions with features that provide rewards, points, or achievements that users can gain for abstaining from distractions. The rewards or punishments within these interventions are sometimes in the form of social actions in which users support or sanction each other. These features influence the value that users ascribe to resisting using a digital distraction (Shenhav et al., 2013).

While the review by Lyngs et al. (2019) resulted in an extensive overview of existing features, it is not known how effective they are at reducing the negative impact of digital distractions. In the context of self-directed learning with digital media, one can imagine scenarios for the various interventions in which they work well, or less well. For example, the approach of complete blocking does prevent distractions from being used. However, complete blocking also means that a learner can no longer access the recording of a course if it is hosted on the same video platform as the entertainment. For such scenarios, only interventions that still allow access to a platform are usable at all. These more permissive interventions then raise questions, like: How well do they prevent entertainment from being used instead of learning? Is it sufficient that learners can see their usage statistics in order to reduce distractions at critical moments? Are self-set time limit goals adhered to, or are additional restrictions necessary? With this review, we address these gaps by providing a systematic literature review of peer-reviewed publications that studied changes in the use of digital distractions when participants used digital self-control interventions.

FIGURE 1 An extended dual systems model of self-control that Lyngs et al. (2019) adapted from Shea et al. (2014). The hexagons list interventions that are connected via dashed lines to the targets where they intervene in the process from sensory input to action

3 | THE PRESENT STUDY

In this study, we contribute to the research on digital self-control interventions with a systematic review (Petticrew & Roberts, 2006) where we examine the effectiveness of these interventions and their underlying features. We used the following research questions to guide our study:
RQ 1: Which types of intervention have evidence for achieving changes in the use of digital distractions?
RQ 2: How effective were the interventions at alleviating digital distractions?
To answer RQ 1, we first looked for published research regarding digital interventions for digital distractions. We then categorized the identified interventions based on the categories proposed by Lyngs et al. (2019). We present these results in a narrative form (compare Petticrew & Roberts, 2006, p. 165f.) where we describe the features of the various interventions. To answer RQ 2, we analyzed the reported results and, when possible, calculated the effect sizes of the interventions (see the Data Extraction and Analysis section).

4 | METHOD

For this review, we followed the Preferred Reporting Items for Systematic Reviews and Meta-Analysis guidelines (PRISMA) (Moher et al., 2009), which sets out the minimum standards of reporting that should be present in a systematic review, and a flow of information to apply for the study selection.

4.1 | Eligibility criteria

We included publications that reported the effects of digital interventions to alleviate digital distractions. Our inclusion criteria were publications that…
• …were written in English.
• …were published in a peer-reviewed outlet.
• …applied a digital intervention to reduce the use of distractions on a computer, smartphone, or a wearable device.
• …reported outcome measures related to changes in the use of digital distraction. Since an effect for a reduction in use is guaranteed for interventions that completely block the distractions, we included use of the intervention as an outcome. The rationale is that if the blocking is used more often, distractions can be accessed less frequently.
• …had a study design that compared outcomes under treatment with the outcomes of a non-treatment group, or a single-group design with a baseline measurement.
• …investigated a nonclinical population, for example, no special needs or ADHD population.

FIGURE 2 Flowchart of the paper inclusion process

4.1.1 | Search strategy

We performed a search of the relevant databases and combined these results with a snowballing approach to obtain additional literature from the references and the citing literature (see Figure 2). For the database search, the search concepts were (1) digital distractions and (2) interventions. From the terminologies used in the literature, we arrived at the search term (‘self-control’ OR ‘self-regulation’ OR ‘mind-wandering’ OR cyberloafing OR ‘media multitasking’ OR willpower OR ‘digital distractions’ OR ‘smartphone addiction’) for the topic of digital distractions. For the interventions, the terms were (intervention* OR app OR browser* OR tracker OR self-monitoring OR digital OR smartphone). Both terms were combined with a logical AND. We searched the databases with the broadest applicable search scope. The databases were: ‘Web of Science’ (scope: Topic), ‘Springer Link’ (scope: All), ‘ACM Digital Library’ (scope: Anywhere), ‘IEEE XPlore’ (scope: Full Text and Metadata) and ‘Pubmed’ (scope: All Fields). We copied all the results into a spreadsheet, where the publications were filtered first by title. Next, we read the abstracts and, for those publications which appeared eligible based on their abstract, we read the full text. From the results of the database search that we considered as relevant, we looked for further papers by incorporating the referenced literature and the literature which cited the publication. For this step, we used ‘Google Scholar’, which had the most extensive results for citing literature.
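The query can also be assembled programmatically when re-running or adapting the search. The sketch below is purely illustrative: the helper name and the generic quoting are ours, and each database uses its own syntax for phrases and wildcards.

```python
# Illustrative sketch: assembling the boolean search string described above.
# The quoting and wildcard syntax is generic; Web of Science, Springer Link,
# the ACM Digital Library, IEEE Xplore and Pubmed each have their own dialect.

distraction_terms = [
    '"self-control"', '"self-regulation"', '"mind-wandering"',
    'cyberloafing', '"media multitasking"', 'willpower',
    '"digital distractions"', '"smartphone addiction"',
]
intervention_terms = [
    'intervention*', 'app', 'browser*', 'tracker',
    'self-monitoring', 'digital', 'smartphone',
]

def or_group(terms):
    """Join a concept list into a parenthesised OR group."""
    return "(" + " OR ".join(terms) + ")"

# The two concepts are combined with a logical AND, as described in the text.
query = or_group(distraction_terms) + " AND " + or_group(intervention_terms)
print(query)
```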
4.2 | Data extraction and analysis

For each publication, we extracted a description of the intervention(s), the research questions or hypotheses, the outcomes, the year of publication, the study design, the study population, recruitment criteria, the study duration, the target distraction, and the measured effects of the intervention(s). If a publication contained multiple interventions or multiple studies for one intervention, we extracted the parameters mentioned above separately for each intervention and each study.

To categorize the features, the first and second authors of the paper independently rated the interventions in the publications according to the coding criteria. The inter-rater agreement between authors was 91%. Consensus about the remaining 9% of the categorizations was reached after a brief discussion.

To calculate effect sizes and confidence intervals, we used the following versions of the calculation of the standardized mean difference Cohen's d. For between-subjects designs we calculated the difference of the means (M) divided by the average standard deviation (SD) (Lakens, 2013):

$$d = \frac{M_1 - M_2}{\frac{SD_1 + SD_2}{2}}$$

For within-subjects designs, we calculated the effect size as the difference of the means divided by the pooled SD (Lakens, 2013):

$$d = \frac{M_1 - M_2}{\sqrt{\frac{(n_1 - 1)SD_1^2 + (n_2 - 1)SD_2^2}{n_1 + n_2 - 2}}}$$

For publications that did not allow conversion to these effect size measures because of missing data in their report, we tried to obtain the data by contacting the corresponding authors of the publication.
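As a minimal illustration of the two formulas above, the following sketch computes both variants from summary statistics. The function names and example numbers are ours; the snippet is not taken from any of the reviewed publications.

```python
import math

def d_average_sd(m1, sd1, m2, sd2):
    """Cohen's d with the average SD in the denominator,
    used in this review for between-subjects designs:
    d = (M1 - M2) / ((SD1 + SD2) / 2)"""
    return (m1 - m2) / ((sd1 + sd2) / 2)

def d_pooled_sd(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d with the pooled SD in the denominator,
    used in this review for within-subjects designs:
    d = (M1 - M2) / sqrt(((n1-1)*SD1^2 + (n2-1)*SD2^2) / (n1 + n2 - 2))"""
    pooled = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled

# Made-up example: 120 vs. 95 minutes of daily distraction time.
print(round(d_average_sd(120, 40, 95, 35), 2))          # 0.67
print(round(d_pooled_sd(120, 40, 30, 95, 35, 30), 2))   # 0.67
```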
4.3 | Risk of bias and quality assessment

To assess the study quality and the risk of bias in the individual studies, we followed the study quality assessment tools by the NHLBI.1 In addition to the criteria from those checklists, we considered study duration, the conduct of a follow-up, and comprehensible reporting of results. This assessment only determines how well the studies are suited to answer our research questions, and does not represent a general categorization into ‘good’ or ‘bad’ studies.

4.4 | Search results

We found 16 publications eligible for this review, which contained a total of 28 interventions. The interventions are listed with a brief description in Table 1. We assigned an identifier consisting of the number sign # and a number to each intervention (#1–#16). Nine publications contained more than one intervention and we appended a lower-case alphabetic letter to their id (e.g., #7a, #7b).

5 | FINDINGS AND ANALYSIS

We start this section with a high-level overview of the interventions and the outcomes that the studies report, followed in the subsections by a narrative description of the interventions. Regarding the multiple interventions within one publication, it should be noted that some interventions within one publication often differed only in minor details, such as the time spent on a distraction before an intervention started. With a few exceptions, we categorized the interventions following the taxonomy that was identified in the review of the commercially available interventions (Lyngs et al., 2019) that we describe above. We deviated in the two following categories: First, we use the term ‘awareness’ (cf. Parry & le Roux, 2019) instead of ‘self-tracking’ because the concept of increasing the users' awareness is more appropriate. Second, we placed interventions that modified the features or content into their own category instead of together with the blocking features. Blocking is functionally different from the modification of features because blocking completely prevents access to the distraction. Feature modification comes into effect only after the content has already been accessed, and a change in activities has already taken place.

The interventions monitored and targeted different kinds of digital activities on different devices. The monitored activities were either smartphone apps, computer programs, or websites (see Table 3). The monitoring was mostly limited to the start of the activity and the duration with which it was active. In one publication, the analysis was extended to more fine-grained interactions within the distraction#16a,#16b. The interventions were active on either mobile devices (primarily smartphones) or computers. Only one intervention synchronized the monitoring across smartphones and computers.

From the interventions on mobile devices, six of the interventions monitored all app activity, nine monitored activity only for apps from a user- or application-defined list, and two interventions monitored only the activity in one specific app. All of these interventions that ran on a computer were designed to monitor website activity, and four interventions also monitored the programs that the participants used. Eleven of the interventions which monitored the browser activity used specific user- or application-defined URLs as intervention

TABLE 1 The publications that were included in this review and a description of the respective interventions

ID References Description
#1 Lottridge et al. (2012) Browser redesign to frame tabs either as ‘work’ or ‘non-work’. ‘Work’ tabs were larger, while other tabs
were smaller and placed on the right side. A ‘work versus non-work’ ratio progress bar was added to
the browser status bar.
#2a Ko et al. (2015) NUGU - A smartphone app that encouraged participants to set time goals during which they commit to
#2b not use their smartphone. One group had social comparison features enabled (#2a) and the other
group did not (#2b).
#3a Foulonneau et al. (2016) TILT - A smartphone app that displayed device use in several visualizations: messages, notifications and
#3b home screen widgets. The content of the visualizations was personalized (#3a) or static (#3b).
#4 Hiniker et al. (2016) MyTime - An app to set time limit goals for monitored apps. Upon reaching those goals, reminders to
stop using those apps popped up.
#5a Y.-H. Kim et al. (2016) TimeAware - Information display of time spent on the device with either a productivity (#5a) or a
#5b distraction (#5b) framing.
#6 Terry et al. (2016) Daily text messages with facts about the negative effects of media multitasking.
#7a Whittaker et al. (2016) meTime - An always-on dashboard that showed the last used applications. Longer use of an application
#7b resulted in a larger display of this application in the dashboard.
#8 Kim, Cho, et al. (2017) PomodoLock - Distracting apps and websites were blocked for 25-min intervals on both smartphone and
PC.
#9a Kovacs et al. (2018) HabitLab - A browser extension with many different interventions. The study with intervention #9a
#9b investigated the effect of rotating interventions on attrition, and study #9b the effect of providing
additional information and control over the interventions.
#10a Okeke et al. (2018) Repeating smartphone vibrations when a monitored website was visited longer than allowed (5 min
#10b #10a, personalized #10b).
#11a Kim, Park, et al. (2019) Lock n’ Type - Participants had to enter a string of random digits before opening monitored apps.
#11b Digit input lengths were zero (#11a), 10 (#11b), and 30 (#11c).
#11c
#12a Kim, Jung, et al. (2019) Goalkeeper - different lockout mechanisms based on individual time goals. From reminder notifications
#12b (#12a), to successively longer lockout durations (#12b), and full-day phone lock upon exceeding the
#12c time (#12c).
#13a Tseng et al. (2019) UpTime - Chatbot-based website blocker (#13a) compared with a time-based website blocker (#13b).
#13b With #13a, users had to provide reasons for visiting distracting websites.
#14 Holte and Ferraro (2020) Changing the display colours of Smartphones to grayscale to decrease the attractiveness of distracting
content.
#15 Loid et al. (2020) Evaluation of the effect of push notifications on screen time and phone checking behaviour.
#16a Lyngs et al. (2020) Two alterations of the Facebook website. In #16a, users had to set action goals when entering the site;
#16b In #16b, the newsfeed was removed.

targets, while two interventions monitored only the activity on a specific website. The following subsections give more details regarding the description of the intervention features and the outcomes that were achieved (Figure 3).

5.1 | Description of intervention features

5.1.1 | Awareness features

Interventions with awareness features inform the participants about their use of distracting activities. This information is presented with the aim to start a process of self-reflection so that the participants themselves decide to spend less time on digital distractions.

Awareness interventions varied according to (a) the mode in which they presented the information to the participants, whether it was within a program or an app, or via instant message, or as a push notification, and (b) the content that was presented to the participants, whether it was statistics, messages, or graphics.

The most common form of presentation was the display of usage statistics: Whittaker et al. (2016) created a computer program#7a,#7b which displayed the most-used programs from the last 30 min to the participants, showing the most-frequently used programs larger than the less frequently used ones. Lottridge et al. (2012) modified the web browser to show statistics about the frequency of work versus non-work websites that the participants visited#1. Other interventions contained visualizations of usage statistics on top of various other features#2a,#2b,#12a,#12b,#12c that are described in the following sections.

Another common form of presentation were different kinds of notifications that alerted the participants: text messages#6, push notifications#3a,#3b,#15a, and popup message dialogues#11a,#4,#12a.
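To make the awareness features more concrete, the following simplified sketch aggregates hypothetical foreground-app events into per-app daily totals, the raw material for usage dashboards and notifications. Real interventions obtain such events from platform APIs (e.g., browser extensions or mobile usage-access services) that are not shown here, and the event log below is invented for illustration.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical event log: (app, foreground_start, foreground_end)
events = [
    ("social_media", datetime(2021, 5, 3, 9, 0), datetime(2021, 5, 3, 9, 12)),
    ("word_processor", datetime(2021, 5, 3, 9, 12), datetime(2021, 5, 3, 10, 5)),
    ("social_media", datetime(2021, 5, 3, 10, 5), datetime(2021, 5, 3, 10, 31)),
]

def minutes_per_app(event_log):
    """Sum foreground minutes per app, the basis for usage statistics."""
    totals = defaultdict(float)
    for app, start, end in event_log:
        totals[app] += (end - start).total_seconds() / 60
    return dict(totals)

print(minutes_per_app(events))
# {'social_media': 38.0, 'word_processor': 53.0}
```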

FIGURE 3 Parameters that describe the various interventions. The ‘Device’ column lists the device type on which the intervention was active (computer or mobile device). The columns on monitored activities and feature categories list these parameters for each intervention. The columns under the ‘Outcome Categories’ header list the outcomes that were achieved with the intervention. A ‘+’ signifies that the intervention had significant positive results on this outcome, a ‘n.s.’ that there were no significant effects on this outcome, and a ‘−’ stands for negative effects on this outcome

The most insistent awareness intervention, created by Okeke et al. (2018), started smartphone vibrations when a time quota for a distracting website (in their case, Facebook) was exceeded#10a,#10b. The smartphone vibrated until the user left the distracting website.

With two exceptions, the interventions informed the participants about the extent of distracting activity use. The first exception were the SMS notifications in the intervention from Terry et al. (2016), which contained facts about the negative effects of media multitasking#6. The second deviation concerned the framing of the content, where all but one intervention focused on the distracting activities. To investigate the effect of a positive framing, Y.-H. Kim et al. (2016) used visualizations for the use of productive programs like word processors#5a, and compared this with distraction-focused visualizations#5b.

5.1.2 | Goal-advancement features

We considered a feature as goal-advancement if participants had the option to set goals related to a reduction of time spent on digital distractions. The goals were either time-based or action-based, where the participants could enter their goals in a free text. The majority of the interventions with goal-advancement features asked the participant to set a time limit for their distracting activities#2a,#2b,#4,#12a,#12b,#12c, and free text goals were less common#16a,#4. To sanction goal deviations, the interventions used blocking of distractions#12b,#12c, repeated reminders#16a,#12a,#4, or the reduction of a virtual score#2a,#2b.

Kim, Jung, et al. (2019) investigated sanctions of exceeding time-goals with three different interaction lockout mechanisms (ILM): In the Non-ILM condition#12a, users received only notifications that reminded them of their time limit. In the Weak-ILM condition#12b, the participants' phones were locked for increasing durations, and Strong-ILM#12c locked the users' phones for the rest of the day after exceeding their time limit goal.

Hiniker et al. (2016) evaluated a smartphone app which supported goals that were both time- and action-based#4. Participants were asked to enter a time limit goal and an optional free text goal for the use of monitored apps. After exceeding the self-set time limit, the participants were reminded of the excess with a warning prompt. Following this prompt, they could close the app, request a time extension, or simply dismiss it.

Lyngs et al. (2020) modified the Facebook website such that participants were prompted to enter free text action goals on each visit#16a. The goal that the participants entered was periodically repeated during their visit on the website.
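A minimal sketch of the escalating reactions to an exceeded time-limit goal described above could look as follows. The three levels only loosely mirror the Non-/Weak-/Strong-ILM conditions, and the thresholds and enforcement hooks are our own placeholders, not the Goalkeeper implementation.

```python
def react_to_excess(minutes_used: float, daily_limit: float, level: str) -> str:
    """Return the reaction for a self-set time-limit goal.
    `level` loosely mirrors the three lockout conditions described above:
    'none'   -> reminder notification only (cf. #12a)
    'weak'   -> temporary lockout that grows with the excess (cf. #12b)
    'strong' -> device locked for the rest of the day (cf. #12c)"""
    if minutes_used <= daily_limit:
        return "within goal - no action"
    excess = minutes_used - daily_limit
    if level == "none":
        return f"reminder: you exceeded your goal by {excess:.0f} min"
    if level == "weak":
        lockout = min(60, 5 * (excess // 10 + 1))  # grows with the excess
        return f"lockout for {lockout:.0f} min"
    return "device locked until tomorrow"

print(react_to_excess(95, daily_limit=60, level="weak"))  # lockout for 20 min
```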
5.1.3 | Blocking features

An intervention was considered as containing blocking features if it prevented access to the distraction. For blocking interventions, we found interventions where the blocking was rigorous#8,#10,#12b,#12c,#13b and others where lifting of the block could be negotiated#11b,#11c,#13a,#12b.

Mark et al. (2018) blocked distractions during work time with the ‘freedom’ software, a blacklist-based website blocker#10. Other interventions were less rigorous and blocked distractions only for specific time intervals. Kim, Cho et al. (2017) created a Pomodoro-timer intervention on the usage of distracting websites. In the Pomodoro method, users set a time interval (typically 25 min), during which they plan to work productively. After this time interval, they get a break, typically 5 min, during which they are free to do whatever they like, before starting the next Pomodoro session.
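The following sketch illustrates a Pomodoro-style block/break cycle of the kind described above. It is not the PomodoLock (#8) implementation; the blocklist and the set_blocking placeholder stand in for the platform-specific enforcement a real tool would need.

```python
import time

WORK_MINUTES = 25   # distractions blocked during this interval
BREAK_MINUTES = 5   # distractions allowed during the break

BLOCKLIST = ["facebook.com", "youtube.com", "twitter.com"]

def set_blocking(enabled: bool):
    # Placeholder: a real tool would hook into the OS, a browser extension,
    # or a local proxy/hosts file to actually enforce the block.
    state = "BLOCKED" if enabled else "allowed"
    for site in BLOCKLIST:
        print(f"{site}: {state}")

def pomodoro_cycle(sessions: int = 4):
    for i in range(sessions):
        print(f"Session {i + 1}: work for {WORK_MINUTES} min")
        set_blocking(True)
        time.sleep(WORK_MINUTES * 60)
        print(f"Break: {BREAK_MINUTES} min, blocking lifted")
        set_blocking(False)
        time.sleep(BREAK_MINUTES * 60)

if __name__ == "__main__":
    pomodoro_cycle()
```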
In two publications, the block was not absolute, and could be lifted: Tseng et al. (2019) created the UpTime system#13a, a website-blocking system with a chatbot to control a time-based blocking system for the transition from break to work. A break was operationalized as an absence of user input for at least 5 min. Upon resumption of work, the chatbot was activated and started disabling access to distracting websites for the next 25 min. The participants could negotiate a lift of the block by navigating through a conversation with the chatbot. The chatbot also suggested to start blocking sessions when a participant spent more than 15 min on distracting websites. This system was compared with a time-based blocking system#13b (cf. #8). Kim, Park, et al. (2019) created the Lock n’ Type intervention, which required participants to enter a string of random digits on their smartphone before they could open a blacklisted app. They tested three different variants of this intervention, which each required different lengths of random digits#11a,#11b,#11c.
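The idea behind such a task-based delay can be illustrated with a few lines of code: before a blacklisted app opens, the user must retype a string of random digits whose length sets the difficulty. This is a hypothetical re-implementation of the general mechanism, not the Lock n’ Type code, and the on-device integration is omitted.

```python
import random

def digit_challenge(length: int) -> bool:
    """Ask the user to retype `length` random digits; return True on success.
    A length of zero corresponds to opening the app without any task (#11a);
    10 and 30 digits correspond to the harder variants (#11b, #11c)."""
    if length == 0:
        return True
    code = "".join(random.choice("0123456789") for _ in range(length))
    print(f"To open this app, type the following {length} digits:")
    print(code)
    entered = input("> ").strip()
    return entered == code

if __name__ == "__main__":
    if digit_challenge(length=10):
        print("Challenge passed - opening the app.")
    else:
        print("Wrong input - the app stays closed.")
```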
5.1.4 | Modification of content

For this review, features were considered as ‘Modification of Features or Content’ if they modified certain aspects of a digital distraction to make it still usable, but less appealing to the participants. To achieve this, these interventions remove non-essential parts of the distraction that are typically designed to convince the users to spend more time with an app or a website.

Holte and Ferraro (2020) switched their participants' smartphone displays to grayscale#14 in order to reduce gratification from distracting activities. The grayscale filter was an integrated feature of their participants' smartphones, and the filter was active at all times and for all apps. Lottridge et al. (2012) modified the web browser tab bar#1 so that tabs that were classified as work-related had their colour contrast enhanced, and were made larger. Tabs from non-work URLs were made smaller and always displayed on the right.

Kovacs et al. (2018) report about ‘Habitlab’, a browser extension that rotates behaviour change interventions for distracting websites, that is, the participants experienced different interventions between visits to the same site#9a,#9b. The authors are not specific about all of the interventions that were active in Habitlab at the time of their studies, but they do mention a few example interventions, like a news feed blocker for Facebook, or hiding ‘related videos’ on YouTube.

Lastly, Lyngs et al. (2020) used a browser extension to remove the newsfeed#16b from the Facebook homepage.

5.1.5 | Reward and punishment

Ko et al. (2015) added social support and point reward features to their ‘NUGU’ smartphone app. The participants were asked to specify a duration for which they wanted to stop using their smartphone (e.g., ‘studying for 30 minutes’). These limiting sessions were conceptualized as ‘missions’, and completing a mission gave the users virtual points. If a participant used the smartphone for anything except incoming calls, the mission failed. Participants could also form groups, start missions together, and compare their time-limiting efforts with those of their peers#2a.

5.2 | Outcomes

For the outcome measurements, we clustered the outcomes into the categories (a) time that participants spent on the distraction, (b) the frequency with which a participant started a distraction, (c) the total time that participants spent on the device, and (d) measures of using and interacting with the intervention.

Measures for the time that participants spend with the distraction are a straightforward way to gauge the effectiveness of an intervention. In this category, we included outcomes that measured how much time the participants spent on websites or in apps that were marked as a distraction by the participants themselves or by the researchers. Summarized in Table 2, the time spent on distraction was measured for 17 interventions.

Measures of the frequency with which distractions are started subsume measures such as navigating to a URL, starting an app, or unlocking the phone. Changes in the frequency of distraction starts were measured for nine interventions, summarized in Table 3.

Time spent on device measures do not discriminate between different activities on the device and can also include the use of apps or programs related to work or learning. These measures include the time spent across all apps or on all URLs. Time spent on the device was measured for 16 interventions. The results are summarized in Table 4.

Measures of how a participant interacts with an intervention can also provide indications of the effectiveness of an intervention. For blocking interventions, an active intervention means that no distracting activity can be started in the first place. Thus, a participant that activated such an intervention more often might have made the intervention more effective than a participant that rarely activated the intervention. These measures can, furthermore, hint as to whether the intervention failed because it was generally not suited to achieve the task, or because the participants simply did not use it. The results for this outcome category are summarized in Table 5.

TABLE 2 The measures that we subsumed under the category time that participants spent on the distraction

ID Outcome Measure Effect size [95% CI]
#1 Time on distraction Time in non-work URLs (+)
#4 Time on distraction Time in monitored apps (+)
#5a Time on distraction Time in monitored apps/URLs n.s.
#5b Time on distraction Time in monitored apps/URLs (+)
#6 Time on distraction MTUAS Scale n.s.
#7a Time on distraction Time on Facebook 0.85 [0.47, 1.22]
#7a Time on distraction Browsing time 0.74 [1.11, 0.37]
#7b Time on distraction Time on Facebook (+)
#9a Time on distraction Time in monitored URLs (+)
#10a Time on distraction Time on Facebook 0.49 [0.23, 1.22]
#10b Time on distraction Time on Facebook 0.60 [0.05, 1.25]
#11a Time on distraction Time in monitored apps (−)
#11b Time on distraction Time in monitored apps (−)
#11c Time on distraction Time in monitored apps (−)
#13a Time on distraction Time in monitored URLs 0.14 [1.04, 1.33]
#13b Time on distraction Time in monitored URLs n.s.
#14 Time on distraction Use of social media apps 0.44 [0.11, 0.77]
#14 Time on distraction Use of video player n.s.
#14 Time on distraction Use of web browser 0.46 [0.13, 0.78]
#15 Time on distraction Self-report n.s.
#16a Time on distraction Time on Facebook 1.62 [0.90, 2.33]
#16a Time on distraction Facebook visit length n.s.
#16b Time on distraction Time on Facebook n.s.
#16b Time on distraction Facebook visit length 0.75 [0.09, 1.40]

Note: When the reporting did not allow the conversion into Cohen's d, the measure is marked with a (+) if a significant reduction in the time that participants spent on the distracting activity was reported. No significant effects for this measure are marked with a ‘n.s.’, and instances where an increase in the time spent on the distracting activity was observed are marked with a (−).

TABLE 3 The measures that we subsumed under the category frequency of distraction starts

ID Outcome Measure Effect size [95% CI]
#1 Distraction start Navigation to monitored URLs 1.84 [0.56, 2.22]
#2a Distraction start Opening apps 0.56 [0.09, 1.04]
#2b Distraction start Opening apps n.s.
#11a Distraction start Opening monitored apps (+)
#11a Distraction start Discouraged app starts (+)
#11b Distraction start Opening monitored apps (+)
#11b Distraction start Discouraged app starts (+)
#11c Distraction start Opening monitored apps (+)
#11c Distraction start Discouraged app starts (+)
#13a Distraction start Periods with visits to monitored URLs 0.64 [0.56, 1.84]
#13b Distraction start Periods with visits to monitored URLs n.s.
#15 Distraction start Number of phone checks n.s.
#16a Distraction start Number of visits to Facebook 1.62 [0.90, 2.33]
#16b Distraction start Number of visits on Facebook n.s.

Note: When the reporting did not allow the conversion into Cohen's d but still reported significant reductions, the entry is marked with a (+). No significant effects are marked with a ‘n.s.’.
5.2.1 | Awareness intervention effects

Awareness interventions led to reductions in time spent on distractions and time on the device both through the display of usage statistics#5a,#5b,#7a,#7b and notifications#10a,#10b.

The interventions #7a and #7b achieved a medium to large effect for time on distraction and total time on the device, but both interventions were active for only 2 days (see also Table 6). Dashboard interventions led to less time on distraction only with distraction-framed content#5b, but not with a productive framing#5a. Neither distraction nor productivity framing#5a,#5b made a difference regarding how long and how often the participants accessed the information.

Awareness notifications only achieved positive effects when they were insistent. For usage monitoring via notifications, no effect was observed on total time on device#3a,#15 and frequency of distraction starts#15. Regular messages about the negative effects of excessive media consumption also had no positive effects#6. Only the smartphone vibrations to remind of overuse led to less time spent on the distraction, with personalized#10a (d = 0.49) and static time quotas#10b (d = 0.69).

5.2.2 | Goal-advancement intervention effects

Time-based goal setting with warning prompts led to reductions in time on device, and the amount of time spent on distractions in one intervention#4, while in other interventions, these prompts of exceeding a time limit did not result in a reduction in time on device until the excess was sanctioned#12a. When sanctions in the form of device lockout#12b,#12c were added, time on device was reduced (#12b: d = 0.35, #12c: d = 0.54).

When participants set action goals in addition to time goals, they were more likely to follow prompts to leave a distraction#4. Setting action goals immediately before the start of a distracting activity#16a also led to a large reduction in total time spent on the distraction (d = 1.62) and the frequency of distraction starts (d = 1.62). Changes in the duration of individual sessions were not significant.

5.2.3 | Blocking intervention effects

Blocking the access to distractions with a task convinced participants to not start distractions proportionally to the task difficulty: Requiring a 30-digit input before opening an app#11c was more effective (d = 0.47) than a 10-digit input task (d = 0.27) or a simple warning prompt (d = 0.13). This task-based delay was the only type of intervention for which an increase in time on distraction was reported, indicating that participants compensated for the initial hurdle of opening an app by spending more time in it. Still, the aspect of being able to negotiate the lift of a block was preferred#13a to time-based blocking, and participants activated the intervention more often (d = 2.06) and started distractions less often (d = 0.82) than with the purely time-based blocking intervention#13b. However, the chatbot also suggested the initiation of additional blocking sessions, thereby potentially skewing the results.

A measure that is particularly relevant for blocking interventions is how frequently the blocking is activated. Participants in a blocking condition#8 completed more limiting sessions in the first week than a control group which only monitored themselves. Over the full two-week duration, these differences were no longer observable.

5.2.4 | Content modification intervention effects

Setting the smartphone display to grayscale#14 led to a reduction in total phone use (d = 0.39), social media use (d = 0.44), and internet browser use (d = 0.46). No effect on video player use was found.

Removing the Facebook newsfeed#16b resulted in a reduction for the duration of an individual session (d = 0.75). No effects were found for the total time spent on the distraction or the starts of the distracting activity.

5.2.5 | Reward intervention effects

When social support features were added to a goal-advancement intervention#2a, participants set significantly more time-limit goals per day (d = 1.87), reduced their time on the device (d = 0.83), and started distractions less often (d = 0.56). These changes were not observed in the variant without the social support features#2b.

5.2.6 | Effects of interventions with mixed feature contribution

For some of the interventions with multiple features, the effect of the intervention cannot be assigned to a single feature. The browser redesign#1 achieved reductions for the time on distraction and the start of distractions. However, during the intervention period there were fewer URLs total visited (d = 1.39), and fewer total tabs open in general, indicating that there was an overall reduction in activity. Rotating different behaviour change interventions led to a reduction in time spent on distraction#9a. The rotation of features was also the focus of the most thorough investigation of intervention use. Kovacs et al. (2018) found that rotating the active interventions led to less time on distraction, but also to more attrition, that is, uninstalling the intervention#9a. The additional information and the additional options#9b that were granted to participants in the follow-up study reduced the attrition significantly. While in the control condition, only 44% of the participants remained after 7 days, the additional information led to 79% survival, and the additional options condition to 80% survival.

5.3 | Confidence in the results and study quality

In this section, we present our assessment of the study quality and the confidence that we have in the results of the studies. We start with the sampling criteria and the sample size, followed by the study duration.

TABLE 4 The measures that we subsumed under the category time that participants spent on the monitored device

ID Outcome Measure Effect size [95% CI]
#2a Time on device Time in all apps 0.83 [0.34, 1.32]
#2b Time on device Time in all apps n.s.
#3a Time on device Time in all apps n.s.
#3b Time on device Time in all apps n.s.
#4 Time on device Average daily smartphone use (+)
#7a Time on device Total time on PC 0.61 [0.24, 0.97]
#7b Time on device Total time on PC (+)
#11a Time on device Total time on smartphone n.s.
#11b Time on device Total time on smartphone n.s.
#11c Time on device Total time on smartphone n.s.
#12a Time on device Smartphone use on workdays n.s.
#12a Time on device Smartphone use on weekends 0.56 [0.13, 0.99]
#12b Time on device Smartphone use on workdays 0.35 [0.12, 0.82]
#12b Time on device Smartphone use on weekends 0.51 [0.09, 0.93]
#12c Time on device Smartphone use on workdays 0.54 [0.07, 1.01]
#12c Time on device Smartphone use on weekends 0.51 [0.10, 0.96]
#14 Time on device Total time on smartphone 0.39 [0.06, 0.72]
#15 Time on device Total time on smartphone n.s.

Note: When the reporting did not allow the conversion into Cohen's d, the measure is marked with a (+) if a significant reduction of the time that the participants spent on the device was reported. No significant effects for this measure are marked with a ‘n.s.’.

TABLE 5 The measures subsumed under the category use of the intervention

ID Outcome Measure Effect size [95% CI]
#2a Intervention use Number of goals (+)
#2b Intervention use Number of goals n.s.
#5a Intervention use Dashboard access n.s.
#5a Intervention use Widget access n.s.
#5b Intervention use Dashboard access n.s.
#5b Intervention use Widget access n.s.
#8 Intervention use Number of started sessions n.s.
#9a Intervention use Installation survival (−)
#9b Intervention use Installation survival (+)
#13a Intervention use Number of sessions 2.06 [1.17, 2.94]
#13b Intervention use Number of sessions n.s.

Note: Significantly more interactions with the intervention are marked with a (+). No significant effects for this measure are marked with a ‘n.s.’.

For the study sample we found several recruiting criteria that were mentioned in the publications (see the ‘Recruitment criteria’ column in Table 6). These included participants with an interest in reducing smartphone usage, participants that felt that they were often distracted, or interested in improving productivity. In the study (Lottridge et al., 2012), participants were pre-screened with the Media Multitasking Index (Ophir et al., 2009), and only those who scored either high or low in this index were included. Kim, Jung, et al. (2019) selected participants according to their readiness within the Transtheoretical Model framework (Prochaska & Velicer, 1997). None of the studies reported blinding either the participants or the investigators to the conditions. It should be acknowledged that a blinding of the participants is often not possible due to the nature of the interventions.

The sample sizes ranged from 12 to 217 participants (M = 52.18, SD = 51.76), and seven studies were conducted with less than 20 participants. Although there are no fixed criteria for sample size requirements, studies with participants in the range of 20 or below are considered adequate to identify usability issues for usability research purposes (Caine, 2016), but these numbers are small compared to behaviour change interventions from other domains (Norman et al., 2007).

The most frequent duration that an intervention was active for was 7 days, with a mean duration of 14.6 days (SD = 9.60). For six interventions2 we found reports about the post-intervention period.

TABLE 6 Study duration in days for the baseline measurement (Pre), the active period (At) and the post-intervention period (Post)

ID Pre At Post Design n Recruitment criteria
#1 7 7 - WI 14 Undergrad students; high or low multitaskers.
#2a 7 14 - WI 35 Interest in ‘improving’ smartphone use.
#2b WI 27 Interest in ‘improving’ smartphone use.
#3a #3b 7 35 - WI 19 Willingness to decrease smartphone use.
#4 7 7 - WI 23 Android smartphone users
#5a 10 20 10 WI 12 Interest in self-tracking and productivity enhancement.
#5b 12
#6 - 7 - BTW & WI 104 University students
#7a 2 2 - WI 61 Office workers and students
#7b 2 2 - BTW & WI 55 Students from a computer mediated communication class
#8 7 14 - BTW 36 University students who ‘were willing to improve their productivity’
#9a 32 32 - WI 217 Online users
#9b 10 10 - WI 93 Online users
#10a 7 7 7 WI 19 Crowdworkers
#10b WI 15 Crowdworkers
#11a #11b #11c 7 7 - WI 36 Students who wanted to reduce smartphone use
#12a #12b #12c 7 21 - WI 44 Students who wanted to reduce smartphone use
#13a #13b 5 10 - WI 15 Office workers
#14 - 10 - BTW & WI 161 Undergrad students
#15 21 30 - BTW 73 Smartphone users
#16a 14 14 14 BTW 39 Students often distracted by Facebook
#16b 38

Note: The ‘Design’ column lists whether the study design was a between subjects (BTW) design or a within subjects (WI) design. The table also contains the sample size (n), and the criteria that were used for participant recruitment. Cells with one row for multiple interventions indicate that both interventions were used in the same study.

For 10 interventions, the reporting of results did not allow a conversion into effect sizes.3 In seven of these cases, this conversion was not possible due to data that was not reported.4 For the interventions #11a, #11b, and #11c, the measure was prevented app openings, and, since the baseline period had zero prevented openings, no comparison with a baseline was possible.

One publication made their data publicly available via the servers of the Open Science Foundation (Lyngs et al., 2020).

6 | DISCUSSION

In this review, we summarized and categorized the existing evidence for digital self-control interventions. We categorized the interventions according to their features into the categories awareness, goal-advancement, blocking, feature modification, and reward and punishment. Awareness features, which were present in 16 interventions, were the most frequent. Goal setting features were present in 10 interventions, mostly represented by time-based goal setting. Nine interventions had features to block access to distracting content. Five interventions modified the content of websites, and two variants of a single intervention used a reward scheme.

To address the research question regarding the effectiveness of the interventions, we first categorized the outcomes into the categories of time spent on distractions, the start of distractions, time spent on the device, and use of the intervention itself. While positive outcomes were reported for all types of interventions, the mixture of features within some of the interventions often made it difficult to pinpoint effects of individual features. This is especially true for features from the awareness category, such as usage statistics, which are
13652729, 2021, 5, Downloaded from https://onlinelibrary.wiley.com/doi/10.1111/jcal.12581, Wiley Online Library on [08/03/2025]. See the Terms and Conditions (https://onlinelibrary.wiley.com/terms-and-conditions) on Wiley Online Library for rules of use; OA articles are governed by the applicable Creative Commons License
BIEDERMANN ET AL. 1229

This is especially true for features from the awareness category, such as usage statistics, which are often present alongside other features. We assume that awareness features are so clearly over-represented because they are comparatively easy to implement: information about how time is spent on the device has to be collected for practically every intervention anyway, and it is tempting to turn these statistics into appealing visualizations. Presumably, these features do no harm, but they make it difficult to establish how much each individual feature contributed to the effectiveness of an intervention. Nevertheless, several observations from the results stand out and should be considered for future interventions.
6.1 | Awareness and self-monitoring are insufficient

The results from those interventions that contained only awareness features indicate that relying solely on users monitoring and adapting their own behaviour is not effective. This reflects the insight that people often avoid monitoring themselves, especially with regard to uncomfortable topics (Chang et al., 2017). Moreover, several studies reported greater effects at the beginning of their intervention, which suggests that a novelty effect could be at least partially responsible for the effectiveness of interventions (Tsay et al., 2019), while habitual behaviour may regain the upper hand later on. Whether through deliberate ignoring or habituation, interventions that users can easily ignore do not appear to achieve the desired results.
6.2 | Circumvention difficulty can tip the scales

On the other hand, interventions that are more insistent in grabbing the users' attention at times of excessive distraction are more effective. The insistence with which an intervention attempts to convince its users, and the ease with which participants can dismiss it, appear to be relevant factors for success. Against the background of autonomy needs and reactance (Ryan & Deci, 2000), it makes sense to give users a way to influence the strictness of an intervention, and the results of those studies where users had such a choice confirm a preference for interventions that allow negotiation. At the same time, if bypassing sanctions takes too little effort, the intervention is once again simply ignored. Striking this balance requires adaptive sanctioning strictness (Schwartz et al., 2021).
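As one hypothetical illustration of what adaptive sanctioning strictness could look like (a sketch under our own assumptions, not a mechanism described by Schwartz et al., 2021 or by any reviewed intervention), the friction of dismissing a block could grow with the number of recent overrides, keeping occasional negotiation cheap while making habitual bypassing increasingly costly:

```python
# Hypothetical sketch: a dismissal delay that escalates with the day's overrides.
# Base delay, growth factor, and cap are invented for illustration.

def dismissal_delay(overrides_today, base_delay=5, factor=2, max_delay=120):
    """Seconds the user must wait before a blocking screen can be dismissed."""
    return min(base_delay * factor ** overrides_today, max_delay)

for n in range(5):
    print(n, dismissal_delay(n))  # 0->5, 1->10, 2->20, 3->40, 4->80 seconds
```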
6.3 | Detection of context

Although the majority of the interventions were tested on students, none of the included studies was conducted explicitly in a learning context. To confirm applicability in learning situations, future interventions should take the context into consideration. A simple way to do this would be to ask users what they are currently doing. A less interruptive approach could be to monitor learning activity in digital learning environments and infer active learning from there.
However, even when the learning context is known, distracting content cannot always be identified with certainty: consider learners who need to communicate with their peers on a social media site, or who watch learning-related content on the same platform where they watch entertaining videos. Thus, in addition to reliable context detection, the classification of content as distracting or learning-relevant needs to become more sophisticated, moving beyond simple URL- or app-based blacklists.
6.4 | Interventions for specific groups

All the interventions were used with student or worker populations, and none with the especially susceptible group of adolescents (van der Schuur et al., 2015). Considering that adolescents are a particularly at-risk population, they should be the target of future research (E. J. Lee & Ogbolu, 2018). Similarly, there are differences in traits between people that may predispose them to respond better or worse to specific features (Mark et al., 2018). These factors have to be identified for interventions to adapt to them.

6.5 | Long-term effects

Another topic that needs to be addressed is how well the results of an intervention transfer into longer-lasting behaviour change. The interventions in this review that addressed this question with follow-ups showed no lasting changes.

6.6 | Limitations

One limitation inherent to reviews is that publications are missed because of the inclusion criteria. Several interventions that fell outside our criteria, such as group-based locking of smartphones (Kim, Jung, et al., 2017), smartwatch applications (Dibia, 2016), or a physical doll that signals excessive usage (Choi et al., 2016), seemed promising or used innovative approaches. We also did not incorporate interviews and other forms of qualitative analysis in this review, which often provide more nuance about an intervention and its reception. Readers interested in a particular kind of intervention should consult these additional insights.

7 | CONCLUSION

There is no reason to assume that digital content will become less distracting, or that the attempts to grab our attention at every waking moment will lessen. Digital self-control interventions are one way to address this problem, but many open issues remain before they can be called a solution.

ACKNOWLEDGEMENTS
The main author of this review is funded by the Leibniz-Kooperative Exzellenz. Open access funding enabled and organized by Projekt DEAL.

CONFLICT OF INTEREST
The authors declare that there is no conflict of interest.

ENDNOTES
¹ https://www.nhlbi.nih.gov/health-topics/study-quality-assessment-tools.
² #5a, #5b, #16a, #16b, #10a, #10b.
³ #3a, #3b, #5a, #5b, #9a, #9b, #7b, #11a, #11b, #11c.
⁴ #3a, #3b, #5a, #5b, #7b, #9a, #9b.

DATA AVAILABILITY STATEMENT
The data that support the findings of this study are available from the corresponding author upon reasonable request.

ORCID
Daniel Biedermann https://orcid.org/0000-0001-9219-222X
Jan Schneider https://orcid.org/0000-0001-8578-6409
Hendrik Drachsler https://orcid.org/0000-0001-8407-5314

REFERENCES
Baumgartner, S. E., Lemmens, J. S., Weeda, W. D., & Huizinga, M. (2017). Measuring media multitasking: Development of a short measure of media multitasking for adolescents. Journal of Media Psychology, 29(2), 1–10. https://doi.org/10.1027/1864-1105/a000167
Bayer, J. B., & LaRose, R. (2018). Technology habits: Progress, problems, and prospects. In B. Verplanken (Ed.), The psychology of habit. Springer International Publishing. https://doi.org/10.1007/978-3-319-97529-0_7
Caine, K. (2016, May 7). Local standards for sample size at CHI. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. San Jose, CA: ACM. https://doi.org/10.1145/2858036.2858498
Chang, B. P. I., Webb, T. L., & Benn, Y. (2017). Why do people act like the proverbial ostrich? Investigating the reasons that people provide for not monitoring their goal progress. Frontiers in Psychology, 8. https://doi.org/10.3389/fpsyg.2017.00152
Choi, S., Jeong, H., Ko, M., & Lee, U. (2016). LockDoll: Providing ambient feedback of smartphone usage within social interaction. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems – CHI EA'16. San Jose, CA: ACM Press. https://doi.org/10.1145/2851581.2892445
Coyne, M. A., & Wright, J. P. (2014). The stability of self-control across childhood. Personality and Individual Differences, 69, 144–149. https://doi.org/10.1016/j.paid.2014.05.026
Dibia, V. (2016, October 23). FOQUS: A smartwatch application for individuals with ADHD and mental health challenges. In Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility. Reno, NV: Association for Computing Machinery. https://doi.org/10.1145/2982142.2982207
Duckworth, A. L., Taxer, J. L., Eskreis-Winkler, L., Galla, B. M., & Gross, J. J. (2019). Self-control and academic achievement. Annual Review of Psychology, 70(1), 373–399. https://doi.org/10.1146/annurev-psych-010418-103230
Eyal, N. (2014). Hooked: How to build habit-forming products. New York: Portfolio/Penguin.
Fiorella, L. (2020). The science of habit and its implications for student learning and well-being. Educational Psychology Review, 32, 603–625. https://doi.org/10.1007/s10648-020-09525-1
Foulonneau, A., Calvary, G., & Villain, E. (2016). Stop procrastinating: TILT, time is life time, a persuasive application. In Proceedings of the 28th Australian Conference on Computer-Human Interaction – OzCHI'16. Launceston, Australia: ACM Press. https://doi.org/10.1145/3010915.3010947
Hiniker, A., Hong, S., Kohno, T., & Kientz, J. A. (2016, May 7). MyTime: Designing and evaluating an intervention for smartphone non-use. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. San Jose, CA: ACM. https://doi.org/10.1145/2858036.2858403
Holte, A. J., & Ferraro, F. R. (2020). True colors: Grayscale setting reduces screen time in college students. The Social Science Journal, 1–17. https://doi.org/10.1080/03623319.2020.1737461
Kim, I., Jung, G., Jung, H., Ko, M., & Lee, U. (2017). Let's FOCUS: Mitigating mobile phone use in college classrooms. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 1(3), 1–29. https://doi.org/10.1145/3130928
Kim, J., Cho, C., & Lee, U. (2017). Technology supported behavior restriction for mitigating self-interruptions in multi-device environments. In Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (Vol. 1(3), pp. 64:1–64:21). Association for Computing Machinery. https://doi.org/10.1145/3130932
Kim, J., Jung, H., Ko, M., & Lee, U. (2019). GoalKeeper: Exploring interaction lockout mechanisms for regulating smartphone use. In Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (Vol. 3(1), pp. 16:1–16:29). Association for Computing Machinery. https://doi.org/10.1145/3314403
Kim, J., Park, J., Lee, H., Ko, M., & Lee, U. (2019). LocknType: Lockout task intervention for discouraging smartphone app use. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems – CHI'19. Glasgow, UK: ACM Press. https://doi.org/10.1145/3290605.3300927
Kim, Y.-H., Jeon, J. H., Choe, E. K., Lee, B., Kim, K., & Seo, J. (2016, May 7). TimeAware: Leveraging framing effects to enhance personal productivity. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. San Jose, CA: ACM. https://doi.org/10.1145/2858036.2858428
Kirschner, P. A., & Karpinski, A. C. (2010). Facebook® and academic performance. Computers in Human Behavior, 26(6), 1237–1245. https://doi.org/10.1016/j.chb.2010.03.024
Ko, M., Yang, S., Lee, J., Heizmann, C., Jeong, J., Lee, U., Shin, D., Yatani, K., Song, J., & Chung, K.-M. (2015, February 28). NUGU: A group-based intervention app for improving self-regulation of limiting smartphone use. In Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing. Vancouver, Canada: Association for Computing Machinery. https://doi.org/10.1145/2675133.2675244
Kovacs, G., Wu, Z., & Bernstein, M. S. (2018). Rotating online behavior change interventions increases effectiveness but also increases attrition. Proceedings of the ACM on Human-Computer Interaction, 2, 1–25. https://doi.org/10.1145/3274364
Lakens, D. (2013). Calculating and reporting effect sizes to facilitate cumulative science: A practical primer for t-tests and ANOVAs. Frontiers in Psychology, 4. https://doi.org/10.3389/fpsyg.2013.00863
Lee, E. J., & Ogbolu, Y. (2018). Does parental control work with smartphone addiction?: A cross-sectional study of children in South Korea. Journal of Addictions Nursing, 29(2), 128–138. https://doi.org/10.1097/JAN.0000000000000222
Lee, H., Ahn, H., Choi, S., & Choi, W. (2014). The SAMS: Smartphone addiction management system and verification. Journal of Medical Systems, 38(1). https://doi.org/10.1007/s10916-013-0001-1
Loid, K., Täht, K., & Rozgonjuk, D. (2020). Do pop-up notifications regarding smartphone use decrease screen time, phone checking behavior, and self-reported problematic smartphone use? Evidence from a two-month experimental study. Computers in Human Behavior, 102, 22–30. https://doi.org/10.1016/j.chb.2019.08.007
Lottridge, D., Marschner, E., Wang, E., Romanovsky, M., & Nass, C. (2012). Browser design impacts multitasking. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 56(1), 1957–1961. https://doi.org/10.1177/1071181312561289
Lyngs, U., Lukoff, K., Slovak, P., Binns, R., Slack, A., Inzlicht, M., Van Kleek, M., & Shadbolt, N. (2019, May 2). Self-control in cyberspace: Applying dual systems theory to a review of digital self-control tools. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. Glasgow, UK: Association for Computing Machinery. https://doi.org/10.1145/3290605.3300361
Lyngs, U., Lukoff, K., Slovak, P., Seymour, W., Webb, H., Jirotka, M., Van Kleek, M., & Shadbolt, N. (2020). ‘I just want to hack myself to not get distracted': Evaluating design interventions for self-control on Facebook. arXiv:2001.04180 [cs]. https://doi.org/10.1145/3313831.3376672
Mark, G., Czerwinski, M., & Iqbal, S. T. (2018). Effects of individual differences in blocking workplace distractions. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems – CHI'18. Montreal, Canada: ACM Press. https://doi.org/10.1145/3173574.3173666
Masood, A., Luqman, A., Feng, Y., & Ali, A. (2020). Adverse consequences of excessive social networking site use on academic performance: Explaining underlying mechanism from stress perspective. Computers in Human Behavior, 113, 106476. https://doi.org/10.1016/j.chb.2020.106476
Mazar, A., & Wood, W. (2018). Defining habit in psychology. In B. Verplanken (Ed.), The psychology of habit. Springer International Publishing. https://doi.org/10.1007/978-3-319-97529-0_2
Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & The PRISMA Group. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Medicine, 6(7), e1000097. https://doi.org/10.1371/journal.pmed.1000097
Norman, G. J., Zabinski, M. F., Adams, M. A., Rosenberg, D. E., Yaroch, A. L., & Atienza, A. A. (2007). A review of eHealth interventions for physical activity and dietary behavior change. American Journal of Preventive Medicine, 33(4), 336–345.e16. https://doi.org/10.1016/j.amepre.2007.05.007
Okeke, F., Sobolev, M., Dell, N., & Estrin, D. (2018). Good vibrations: Can a digital nudge reduce digital overload? In Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services – MobileHCI'18. Barcelona, Spain: ACM Press. https://doi.org/10.1145/3229434.3229463
Ophir, E., Nass, C., & Wagner, A. D. (2009). Cognitive control in media multitaskers. Proceedings of the National Academy of Sciences of the United States of America, 106(37), 15583–15587. https://doi.org/10.1073/pnas.0903620106
Parry, D. A., & le Roux, D. B. (2019). Media multitasking and cognitive control: A systematic review of interventions. Computers in Human Behavior, 92, 316–327. https://doi.org/10.1016/j.chb.2018.11.031
Petticrew, M., & Roberts, H. (Eds.). (2006). Systematic reviews in the social sciences. Blackwell Publishing Ltd. https://doi.org/10.1002/9780470754887
Prochaska, J. O., & Velicer, W. F. (1997). The transtheoretical model of health behavior change. American Journal of Health Promotion, 12(1), 38–48. https://doi.org/10.4278/0890-1171-12.1.38
Rosen, L. D., Mark Carrier, L., & Cheever, N. A. (2013). Facebook and texting made me do it: Media-induced task-switching while studying. Computers in Human Behavior, 29(3), 948–958. https://doi.org/10.1016/j.chb.2012.12.001
Ryan, R. M., & Deci, E. L. (2000). Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. American Psychologist, 55(1), 68–78. https://doi.org/10.1037/0003-066X.55.1.68
Schwartz, R., Monge Roffarello, A., De Russis, L., & Apostolellis, P. (2021, May 8). Reducing risk in digital self-control tools: Design patterns and prototype. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems – CHI'21. Yokohama, Japan: ACM. https://doi.org/10.1145/3411763.3451843
Shea, N., Boldt, A., Bang, D., Yeung, N., Heyes, C., & Frith, C. D. (2014). Supra-personal cognitive control and metacognition. Trends in Cognitive Sciences, 18(4), 186–193. https://doi.org/10.1016/j.tics.2014.01.006
Shenhav, A., Botvinick, M. M., & Cohen, J. D. (2013). The expected value of control: An integrative theory of anterior cingulate cortex function. Neuron, 79(2), 217–240. https://doi.org/10.1016/j.neuron.2013.07.007
Terry, C. A., Mishra, P., & Roseth, C. J. (2016). Preference for multitasking, technological dependency, student metacognition, & pervasive technology use: An experimental intervention. Computers in Human Behavior, 65, 241–251. https://doi.org/10.1016/j.chb.2016.08.009
Tsay, C. H.-H., Kofinas, A. K., Trivedi, S. K., & Yang, Y. (2019). Overcoming the novelty effect in online gamified learning systems: An empirical evaluation of student engagement and performance. Journal of Computer Assisted Learning, 36, 128–146. https://doi.org/10.1111/jcal.12385
Tseng, V. W.-S., Lee, M. L., Denoue, L., & Avrahami, D. (2019). Overcoming distractions during transitions from break to work using a conversational website-blocking system. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems – CHI'19. Glasgow, UK: ACM Press. https://doi.org/10.1145/3290605.3300697
van der Schuur, W. A., Baumgartner, S. E., Sumter, S. R., & Valkenburg, P. M. (2015). The consequences of media multitasking for youth: A review. Computers in Human Behavior, 53, 204–215. https://doi.org/10.1016/j.chb.2015.06.035
Varol, F., & Yıldırım, E. (2019). Cyberloafing in higher education: Reasons and suggestions from students' perspectives. Technology, Knowledge and Learning, 24(1), 129–142. https://doi.org/10.1007/s10758-017-9340-1
Whittaker, S., Kalnikaite, V., Hollis, V., & Guydish, A. (2016, May 7). ‘Don't waste my time': Use of time information improves focus. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. San Jose, CA: Association for Computing Machinery. https://doi.org/10.1145/2858036.2858193
Willems, Y. E., Boesen, N., Li, J., Finkenauer, C., & Bartels, M. (2019). The heritability of self-control: A meta-analysis. Neuroscience and Biobehavioral Reviews, 100, 324–334. https://doi.org/10.1016/j.neubiorev.2019.02.012
Wood, W., & Rünger, D. (2016). Psychology of habit. Annual Review of Psychology, 67(1), 289–314. https://doi.org/10.1146/annurev-psych-122414-033417

SUPPORTING INFORMATION
Additional supporting information may be found online in the Supporting Information section at the end of this article.

How to cite this article: Biedermann, D., Schneider, J., & Drachsler, H. (2021). Digital self-control interventions for distracting media multitasking - A systematic review. Journal of Computer Assisted Learning, 37(5), 1217–1231. https://doi.org/10.1111/jcal.12581
