TikTok Recommendation Algorithm
Abstract
This article analyzes algorithm awareness as a process—a series of activities intended to
reach a goal over time. It examines how a group of Costa Ricans understood, felt about,
and related to TikTok and its algorithms as they began using the app for the first time.
Data come from diary entries completed by 43 participants about their use of TikTok over
a month and seven focus groups with these diarists. The article discusses five activities
through which users expressed developing forms of awareness of TikTok’s algorithms
and enacted various rhythms in the experience of the app: managing expectations about
what TikTok is and how it works; “training” the app; experiencing a sense of algorithmic
personalization; dealing with oscillations in the pertinence of recommendations; and
showing various forms of rejection of TikTok. The article then considers some implications
of bringing time to the fore in the study of algorithm awareness.
Keywords
Algorithms, awareness, diaries, folk theories, Latin America, personalization, process,
rhythm, temporality, TikTok
Introduction
Awareness of algorithms is key in how people understand, feel about, and relate to
platforms. Algorithm awareness also matters as it is tied to an increased capacity to
limit manipulation and exploitation through data extraction practices, protect human
autonomy and privacy, refine Internet skills, and develop critical thinking abilities
(Dogruel et al., 2021; Gruber et al., 2021; Oeldorf-Hirsch and Neubaum, 2021; Shin
et al., 2022).
Despite its manifold contributions, research on users’ relationship with algorithms has
often espoused a view of awareness as a relatively static possession that people either have or do not. However, as Dogruel et al. (2021) note, “algorithms are subject to evolvement over time and their mutual shaping with user interactions make them difficult to approach using predefined assumptions about factual knowledge” (p. 16). Focusing on
awareness as a process—a series of activities intended to reach a goal over time—would
thus make it possible to identify the stabilities and changes in how users think about, feel
about, and relate to algorithms. To implement this approach, in this article, we ask, how
do people develop an awareness of TikTok’s algorithms over time? What activities char-
acterize the process of establishing a relationship with the app and its algorithms? Data
come from diary entries completed over a month by 43 new TikTok users aged 18–66 years from different sociodemographic backgrounds and seven focus groups with
these participants.
Defined by its parent company as “the leading destination for short-form mobile
video [. . .] to inspire creativity and bring joy” (TikTok, 2020), TikTok offers a compel-
ling case for a study of algorithm awareness. The platform allows users to post videos of
up to 10 minutes. Users can then access these videos through their “For You” page,
described on TikTok’s website as an algorithm-powered “stream of videos curated to
your interests” that constitutes “the magic” of the app (TikTok, 2020). Although it is also
possible to “follow” specific profiles, TikTok emphasizes both materially and discur-
sively the centrality of algorithms in the experience of the app. There is a generalized
belief that, compared to other platforms, TikTok has the most “aggressive” and “addic-
tive” algorithms (Siles and Meléndez-Moran, 2021; Schellewald, 2021). In a succinct
explanation, TikTok indicates that its “recommender system” prioritizes the following
three main factors: user interaction metrics, metadata about the videos, and “device and
account settings” (TikTok, 2020). Kang and Lou (2022) elaborate,
TikTok applies natural language processing to identify textual and audio elements (e.g., sounds)
of the videos that users enjoyed, computer vision to classify the videos’ visual components, and
analysis of the hashtags and captions connected to such videos. TikTok’s algorithm is so
powerful and aggressive that it can learn the vulnerabilities and interests of a user in less than
40 minutes. (p. 4)
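To make these factors more concrete, the sketch below shows one way such signal families could in principle be combined into a single ranking score for a “For You”-style feed. It is a purely illustrative toy written for this discussion: the field names, weights, and scoring formula are hypothetical assumptions, not TikTok’s actual (undisclosed) implementation.

```python
from dataclasses import dataclass


@dataclass
class Video:
    video_id: str
    hashtags: set[str]
    language: str
    completion_rate: float  # share of viewers who watched to the end (0-1)
    like_rate: float        # likes per view (0-1)


@dataclass
class UserProfile:
    liked_hashtags: set[str]  # interests inferred from past likes, shares, watch time
    language: str             # taken from account/device settings


def score(video: Video, user: UserProfile) -> float:
    """Hypothetical relevance score combining the three signal families."""
    # 1. User interaction metrics: how viewers in general engaged with the video.
    interaction = 0.6 * video.completion_rate + 0.4 * video.like_rate

    # 2. Video metadata: overlap between the video's hashtags and the user's
    #    inferred interests.
    metadata = (
        len(video.hashtags & user.liked_hashtags) / len(video.hashtags)
        if video.hashtags
        else 0.0
    )

    # 3. Device and account settings: treated here as a weaker signal
    #    (e.g., language match).
    settings = 1.0 if video.language == user.language else 0.2

    # Arbitrary illustrative weights; not TikTok's.
    return 0.5 * interaction + 0.35 * metadata + 0.15 * settings


def rank_feed(candidates: list[Video], user: UserProfile) -> list[Video]:
    """Order candidate videos for a 'For You'-style feed, best first."""
    return sorted(candidates, key=lambda v: score(v, user), reverse=True)


if __name__ == "__main__":
    user = UserProfile(liked_hashtags={"woodworking", "diy"}, language="es")
    feed = rank_feed(
        [
            Video("a", {"dance", "viral"}, "en", completion_rate=0.9, like_rate=0.2),
            Video("b", {"woodworking", "diy"}, "es", completion_rate=0.5, like_rate=0.1),
        ],
        user,
    )
    print([v.video_id for v in feed])  # ['b', 'a']: interest and language match outweigh popularity
```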
Temporality is thus key to the experience of TikTok, from the incomparable speed
at which users feel its algorithms react to their actions, to the unique pace at which
people swipe content to signal their interests and preferences to these algorithms
(Boffone, 2022).
We begin by discussing recent scholarship on algorithm awareness and the tempo-
rality of social media, and the particularities of our research design. We then examine
five activities through which users expressed developing forms of awareness of
TikTok’s algorithms and enacted various rhythms in the experience of the app: managing expectations about what TikTok is and how it works; “training” the app; experiencing a sense of algorithmic personalization; dealing with oscillations in the pertinence of recommendations; and showing various forms of rejection of TikTok. We conclude by considering some implications of bringing time to the fore in the study of algorithm awareness.
Research design
The use of TikTok has grown exponentially in Latin America over the past 2 years. Some
sources estimate it is the third most used algorithmic platform in the region, after
Facebook and Instagram (LABS, 2022). More specifically, Costa Rica offers various
characteristics that make it an ideal case for conducting this study. The country is a
regional leader in the use of various apps and is one of the most digitized countries con-
sidering infrastructure, culture, and use. This condition creates fertile grounds for such
algorithmic platforms as TikTok (Siles, 2023).
We began our study by sampling individuals who met various criteria. We
looked for participants of different age groups (rather than relatively young informants
only) and people with various degrees of education (rather than college students or par-
ticipants with college degrees only). In addition, we focused on individuals who had
never used TikTok under the premise that their unfamiliarity with the app made it easier
to recognize the development of algorithm awareness.
We circulated a call for participation on the social media profiles of the university
where the research was conducted. Interested participants filled out an online question-
naire that inquired into their previous experiences with TikTok, their use of social media
platforms, and background sociodemographic information. We selected 43 individuals
for the study: 12 participants were aged 18–29 years; 21 were aged 30–39 years; five
were aged 40–49 years; and five were aged 50–66 years. The average age of participants
was 34 years old. In total, 29 participants identified themselves as women and 14 as men.
The sample of participants included people without formal education, college students,
and professionals. Participants were familiar with the use of other algorithmic platforms.
All of them indicated they regularly used Facebook, Instagram, and YouTube. Most said
they also used Netflix and, to a lesser extent, Spotify and Pinterest.
We then asked selected participants to share diary reports of TikTok use. In keeping
with previous studies (Moe and Ytre-Arne, 2021; Risi et al., 2020), we asked participants
to share daily reports during the first 2 weeks of the study and invited them to submit
weekly reports instead over the following 2 weeks. Of 43 participants, nine stopped
sending their reports at some point. When this occurred, we inquired into their reasons
for abandoning the study and thus turned the non-use of the app into a finding in itself.
The remaining 34 participants submitted a total of 275 diary reports. Data collection of
diary entries took place between February and March 2022. We offered participants two
options to submit their responses: an online form where they could share written obser-
vations and WhatsApp chats where they could submit audio recordings. We transcribed
the audio messages we received.
We triangulated our research methods to minimize “the inevitable artificiality and
performative dimension of the diary process” (Markham and Couldry, 2007: 675). Thus,
we conducted seven focus groups with participants: four before they submitted their
diary reports and three afterward. In the initial focus groups, we explained to participants
our research design and clarified questions regarding the submission of the reports. We
also asked participants to discuss their motivations to take part in the study and to
describe their perceptions about TikTok before they began using the app. These conver-
sations took place in February 2022, lasted for an average of 35 minutes, and were con-
ducted on Zoom (because of limitations to carrying out in-person meetings in the context
of COVID-19). The focus groups that took place after the submission of diary reports
served primarily to triangulate our data and add nuance and context to our findings.
These conversations, also conducted on Zoom, took place in March 2022 and lasted for
an average of 52 minutes.
All the data were collected in Spanish (translations are our own). To analyze the
data from the questionnaire, diaries, and focus groups, we employed an abductive
approach that “rested on the cultivation of anomalous and surprising empirical find-
ings against a background of multiple existing [. . .] theories and through systematic
methodological analysis” (Timmermans and Tavory, 2012: 169). During an initial
round of open coding, we identified the various expressions of awareness manifested
by participants, and compared them to similar findings in previous studies. We then
examined these forms of awareness through axial coding, incorporating a process
dimension designed to make visible both stabilities and changes in participants’
awareness over time and to further identify the novelty of our findings. Finally, during
a round of selective coding, we developed five categories (which we conceptualized
as activities) to capture the shifting patterns of cognitive, affective, and practical
awareness we identified in previous rounds of coding. During this stage, we also
accounted for the rhythms enacted in participants’ relationships with TikTok in light
of previous theoretical categories.
Expectations
Awareness does not emerge in a vacuum but is rather a product of broader contexts. We
explored these contexts by analyzing the expectations that participants had about TikTok
before they started using it. These expectations provided the background against which
participants sought to achieve the goal of establishing a relationship with TikTok.
A sense of disinterest in TikTok prevailed in people’s responses to our preliminary
online questionnaire. Given its origins (as Musical.ly) and how it originally targeted
adolescents, many participants envisioned TikTok as an app made for young people, a
group of users they associated with notions of shallowness and thoughtlessness. When asked
to define what TikTok meant for them when they signed up for the study, it was common
for participants to express such ideas as, “[It is] a platform for young people” or “for
kids,” or a “juvenile” app. In his questionnaire response, Federico also emphasized the
format of the content: “It’s an app for short [decontextualized] videos.” The combination
of young users and viral videos turned TikTok into what one participant defined as “the
incoherence of sanity and the debauchery of stupidity.” In this view, TikTok was an app
for people who were unable to understand for themselves the wider implications of the
content they watched, and who were mostly interested in vain, futile, and frivolous
things. These ideas reveal how depersonalized the relationship with TikTok was for par-
ticipants before they started using the app: TikTok was for others (specifically a vulner-
able group of people) but not for them.
This indifference toward TikTok can be conceptualized as desynchronized tempo-
ralities (Jordheim and Ytreberg, 2021). For our interlocutors, there did not seem to be
a place for TikTok’s fast pace in the rhythm of their own daily life. As some people
expressed in the preliminary questionnaire, TikTok seemed a “very time-consuming”
app with “accelerated content.” A few even mentioned the word “addictive,” thus sug-
gesting that TikTok prevented people from freely managing their time. Larissa thus
acknowledged in the questionnaire: “I have never used [TikTok], I think I would not
stop watching [videos].” Against the background of a cultural mandate to always be
productive and the futility attributed to the app, many participants did not hesitate to
call TikTok “a waste of time.”
Participants nuanced these accounts during the focus groups carried out before they
began completing the diaries. During these conversations, many admitted being curious
about TikTok, which most considered “fashionable” at the time of the study. For some,
this created pressure to learn more about the app. Most participants narrated instances
in which they had seen people close to them in their social circles (children, colleagues, friends) using TikTok, which had sparked some interest in it. The experiences of these early
adopters revealed to participants that the app could contain content of potential value to
them. Learning to communicate with other users through the app’s technological and
cultural codes was an important motivation for some participants. That was the case of
Sonia, a 66-year-old translator who said she wished to better understand her teenage
grandchildren.
On several occasions during focus groups, participants framed our study as a legiti-
mate opportunity for them to use an app they were curious about but had mostly discred-
ited. “I was looking for an excuse to use TikTok!” said Ricardo, a 63-year-old woodworker.
Participants defined this “excuse” in utilitarian terms: TikTok could provide them with
valuable information for their professional activities. In exchange for their participation,
they hoped to obtain resources to fulfill a personal or professional purpose.
Only one participant mentioned the term “algorithm” in the preliminary questionnaires or initial focus groups. This does not mean that participants
lacked awareness but rather suggests that algorithms were not necessarily central in
accounting for their perception of the app before they began using it. Although some
expressed ideas about how the app’s recommendations worked (so-called “folk theo-
ries”), they did not use the term “algorithm” to articulate those ideas, nor did they center their explanations on the specific role of algorithms.
Training
Creating a rhythm between the desynchronized temporalities of TikTok’s time-consum-
ing pace and the productive daily life of diarists took work (cf. Jordheim and Ytreberg,
2021). This rhythm manifested primarily as “repetition” (Carmi, 2020): engaging in a set
of recurrent practices to produce a visible result. Participants understood this repetitive
work as (mutual) “training”: they had to strategically act to receive interesting recom-
mendations on TikTok, while TikTok had to work to adjust to their preferences. In a
focus group conversation, Larissa summarized the gist of the training activities: “I had to
learn how to make the algorithm work in my favor.” In this sense, training is comparable
to what other researchers have termed “domestication”: users “tame” algorithms as they
find value for them, develop routines around their use, and construct perceptions of the
world based on them (Siles, 2023; Simpson et al., 2022).
Most diarists indicated that the “For You” page was their default starting point when
they began using TikTok, thus situating algorithms at the core of their experience. Some
reported that their preferred strategy to begin training the app was to search for specific
profiles or keywords. Matilde (a 30-year-old saleswoman) captured the purpose of this
practice with precision in her first diary entry:
It seems to me that you can find good content on tik tok [sic] (IF YOU LOOK FOR IT) but I
get the impression that the algorithm is there to recommend content with the most views,
regardless if the content is good or bad. (Caps in original)
Matilde’s words (and explicit use of caps) reveal how diarists thought it was their obliga-
tion to engage in training activities to counteract TikTok’s default tendency to recommend
popular videos (as opposed to personal content).
It was common for participants to identify patterns in the kinds of recommendations
they received in their initial interactions with TikTok. In his very first report, Federico
revealed his surprise when TikTok recommended a video made by someone in the same
place where he lives in the north of Costa Rica. Two days later, he reported how he
thought TikTok was showing him content based on the profiles of users he had previ-
ously watched, under the assumption that this revealed his preferences.
Finding ideal times and places to use the app was key in training TikTok’s algorithms.
Scholars in the domestication tradition have referred to this process as “incorporation”: the integration of technologies into the temporal routines of everyday life. One diarist described these routines as follows:
I used [TikTok] several times throughout the day. Maybe I was at work and wanted a mental
break, so I grabbed and checked TikTok, Instagram, and Facebook. At lunchtime, I always
checked it a couple of times. Before going to bed, I looked at [TikTok] a little bit.
These routines shaped awareness in different ways. On the one hand, diarists used TikTok
for relatively short periods to pass the time while doing something else (a work break,
waiting in line at the bank). That was the case of Federico, who reported using TikTok
for periods of approximately 10 minutes at a time. This typically occurred in public
spaces, which made participants more attentive to content that could bother others around
them. On the other hand, participants devoted time to specifically using the app. The
most common example given by diarists was using the app “before going to bed.” Larissa
thus waited until the end of the day to use TikTok and indicated that she watched videos
for periods that often exceeded 1 hour. This routine often took place in private spaces,
which participants considered ideal to assess the app’s recommendations and identify the
types of content they liked.
Participants integrated TikTok into their daily social media routines and created fertile
grounds for comparisons across platforms. Diarists’ early understanding of TikTok
developed precisely as they identified similarities and differences with other apps that
they were familiar with, most notably Instagram. In her first report, Matilde noted, “My
first impression is that [TikTok is] about videos to entertain; the more ridiculous they are,
the more attention they attract and the more followers they get. It was very similar to the
Reels I see on Instagram.” Since she had used Instagram prior to encountering TikTok,
Matilde assumed the content she found on Reels was more characteristic of Instagram
than any other app.
Training activities were associated with specific emotions in participants. Federico
wrote in his sixth entry that the uncertainty of training activities made him feel “inse-
cure.” Many diarists found this activity frustrating because it required constantly invest-
ing time and effort without perceiving immediate results. Participants also used the
notions of “boredom” to describe this activity and “disinterest” to recount its results (cf.
Lupinacci, 2022). Many diarists also reacted emotionally against the persistence of what
they considered biases in their “For You” page. Recalling this process, Florencia
(41-year-old teacher) said during a focus group: “At first, it was horrible. There were
very sexualized videos, which made me feel uncomfortable. I didn’t like it.”
Compared to initial questionnaires and early focus groups, many more participants
mentioned the term “algorithm” in the diary reports to describe training activities. The
emergence of the word “algorithm” (often in the singular) in participants’ vocabulary
added some precision to what had been a generic description of TikTok. “The algorithm”
became an umbrella term to refer to what participants thought were TikTok’s specific
computational procedures to recommend content. It was also common for diarists to use
“TikTok” and “algorithm” almost interchangeably, consistent with a view of TikTok as
an assemblage, an inseparable tissue of relationships between app, algorithms, and users
(Siles and Meléndez-Moran, 2021).
Some of the ideas that diarists developed about algorithms came from the active use
of TikTok. In their reports, diarists described algorithms as computational mirrors of
their actions. They interpreted algorithms as reactive entities that responded to their
training practices in a precise manner. In her second diary report, Sonia noted, “As I
interact with the application, it adjusts to my tastes. [. . .] It has one formula to detect
trends and prioritizes this content, in addition to taking into account user interactions.” In
this account, it is the users who lead the process by shaping the behavior of algorithms
through their actions.
Awareness of algorithms during training activities also led to gradually personal-
izing the relationship with the TikTok assemblage. To describe their interactions with
TikTok, diarists employed terms that suggest a change in their rapport with the app
and its algorithms, compared to their initial expectations. After her third report,
Larissa began employing a vocabulary that suggested she was engaging in a dialogue
with TikTok through her training practices. She interpreted each recommendation as
a response to her interventions as part of this dialogue (enacted by such practices as
liking certain content, spending more time watching videos, or avoiding engaging
with specific content). Although diarists assigned algorithms a more reactive role,
they also emphasized how capable and precise they thought these were in responding
to training.
Participants also integrated “exogenous” sources of knowledge into their awareness
of TikTok’s algorithms. Some knowledge came from comparisons between TikTok and
other apps (cf. Espinoza-Rojas et al., 2022). Larissa said during a focus group that her
understanding of algorithms came from her previous experiences with Facebook. Diarists
also drew on articles in media outlets or cultural products to interpret the role of TikTok’s
algorithms. Agustina, a 51-year-old journalist, narrated how various Netflix documenta-
ries had shaped her understanding of algorithms:
What I learned is that apps track your behavior and compare it statistically with all the other
users. [The goal] is to attract [people’s] attention, to make them use the apps again, to make
room for more advertising, to make them watch other things. Personally, I don’t demonize
[algorithms]. Sometimes they’re useful.
Agustina’s premise was that the algorithms of different platforms (including TikTok)
perform the same function: to manipulate users. But she also relativized the importance
of this manipulation by suggesting she remained in control of algorithms and by high-
lighting the services they could provide to her.
Personalization
After a few reports that focused mostly on describing training activities, diaries evinced
a relatively different understanding of algorithms. Diarists began noting that training
seemed to have finally worked. For Larissa, this occurred on the third day of using
TikTok. She expressed this with enthusiasm in her diary entry: “This time I liked [TikTok]
better because the algorithm is already giving what I like.” Federico reported a similar
experience also on the third day of use: “I accepted it a little bit more. I got carried away.”
Four days later, he went further: “I [arrived to] moments when I laughed inside [and] the
joy overflowed without thinking about it.”
Lupinacci’s concept of “harmony” aptly describes the “algorhythm” enacted in this
activity. For Lupinacci (2022), a harmonious rhythm takes place when “the content and
the people shown to you first are those attuned to your individual preferences and past
engagement” (p. 9). Harmony also manifests in the sensation that algorithms produce
“right-time” recommendations in ways that seem anticipatory (Bucher, 2020).
Whereas the notion of training situated people as those leading the relationship with
TikTok, participants now felt they could “sit back” and enjoy the rhythm of harmonious
algorithmic personalization. Roles and responsibilities shifted during this activity: algo-
rithms offered them content and participants reacted to these recommendations. Florencia
explained during a focus group: “The content kept changing until I suddenly realized that I
had spent an hour watching videos.” In the accounts of Florencia and Federico (cited above),
it was algorithms that made them lose control of time while they focused entirely on the app.
As Siles (2023) has noted, it would be misleading to describe changes in users’ under-
standing of their agency as a move to passive roles. Instead, Siles (2023) refers to this
state as “active passivity,” which Gomart and Hennion (1999) define as “a movement in
which loss of control is accepted and prepared for [. . .] the abandonment of forces to
objects and the suspension of the self” (p. 227). Reaching this state is thus experienced
as an accomplishment. Personalization is about letting algorithms act on users after users
have acted on algorithms (Siles and Meléndez-Moran, 2021).
Both in their reports and in subsequent focus groups, many participants highlighted the role of algorithms (explicitly using the term) in explaining how recommendations had improved
in such visible ways. In Violeta’s words, “The algorithm might have read my profile a little
better and what I’ve liked. It [also] read that I didn’t like certain content.” This comment
shows again the precision that diarists attributed to TikTok’s computational capacities.
TikTok thus confirmed expectations derived from their experiences with this app and with other platforms, as well as from ideas in public discourse about how algorithms operate.
Many participants expressed the sense that they had achieved personalization rela-
tively fast. Like Larissa’s and Federico’s, most diary reports included statements of
this nature after the third or fourth day of use. Florencia noted in her second report that
TikTok was still in “default mode.” The following day, she wrote instead, “The algorithm
is starting to work. I don’t hate [TikTok] as much.” In her sixth report, she added, “The
algorithm is getting to know me 😏😅” (emphasis added). In this perspective, TikTok was
not a depersonalized entity with which they had no relationship any longer but a personal
platform that had something specifically for “me.” The emoji Florencia used conveyed
how she felt obligated to acknowledge that the app’s algorithms seemed to have worked
and the pleasure she derived from this result. Participants thus felt interpellated by
TikTok’s algorithmic recommendations. In this context, interpellation refers to “the work
embedded in algorithms to convince [people] that [platforms] are speaking directly to
them, ‘hailing’ them in particular ways” (Siles, 2023: 35). As Cohn (2019) notes, algo-
rithmic interpellation functions when users begin thinking in the terms offered through
the interface of a platform.
Diarists expressed much more positive emotions when describing their awareness of algorithmic personalization than at other moments. Despite her initial contempt for
the app, Florencia began feeling “more comfortable” with the content she received
after the third day of using TikTok. Similarly, Mónica, a 38-year-old babysitter, wrote
in her third diary entry: “I don’t find TikTok so boring anymore!” and mentioned she
felt “happiness!” when using it. For diarists, this change in their affective state justified
a sustained use of the app: “I did like it today. I will keep using it,” wrote Clara in her
second diary entry.
Participants described how they engaged in new use practices as a reaction to relevant
recommendations they received once they had reached personalization. A telling exam-
ple is how diarists began sharing and exchanging videos with others. In her sixth diary
entry, Larissa recounted, “[Today] I got things that I was able to share. It was very enter-
taining!” For Larissa, sharing content with others for the first time was only possible
because TikTok recommended videos to her that were worth sharing. She thus understood
her own practice as a fitting response to the quality of algorithmic recommendations she
had begun to receive. Becoming aware of changes in algorithmic operations thus moti-
vated her to act differently compared to previous days.
Oscillations
Some participants noted that they consistently enjoyed such a state of harmonious algo-
rithmic personalization. In their reports, these diarists indicated they continuously
received video suggestions that met their interests. Alvaro, a 23-year-old college student,
used the words “joy” and “laughter” repeatedly in his reports until the end of the study.
But, for most diarists, recommendations oscillated between videos they liked and others
they did not, even after having reached personalization. Larissa thus noted in her seventh
diary entry, only a few days after her enthusiastic assessment of TikTok’s algorithms
reported in the previous section: “It’s very ‘choppy’ (revuelto), [there is] a lot of every-
thing.” She elaborated in her report the following day: “Whereas other days I received
very good suggestions and I took the time to watch them, I didn’t like yesterday’s sug-
gestions very much. I don’t know why!”
The “algorhythm” enacted in this activity could be described as “oversaturation,” that
is, the sensation that people had to deal with too many and very mixed recommendations
at an exceedingly fast pace. In other words, some participants felt the pace of TikTok’s
algorithms surpassed the rhythm of their lives. How participants experienced this rhythm
is nowhere clearer than in the words of Violeta, who wrote, “Too much movement and
everything is very fast. It is too dynamic for my taste.”
Becoming aware of changes in algorithms led our interlocutors to express another
fundamental expectation: stability. Most diarists assumed they would continue to receive
recommendations they found germane once they felt personalization had been achieved.
When this expectation was not met, they manifested feelings of surprise and confusion.
In response to oscillation in her recommendations, Violeta wrote in her 10th report: “I
feel there is a setback.” Unstable recommendations thus went against the premise that
algorithms should be stable and precise rather than variable and prone to errors that users
felt had been overcome.
This belief in algorithms as entities that always perform the same tasks in an unalter-
able manner confirms studies that have shown how users increase their trust in
algorithmic systems when they feel these systems do not “violate” their expectations
(Kizilcec, 2016). Participants did not feel the need to further decipher how TikTok’s
algorithms worked unless one of their expectations was not met. Whereas diarists elabo-
rated on how they thought that algorithms worked during training and personalization
activities, they reduced their observations on this issue after various days of sharing their
reports. Once they thought they had figured out how to reach personalization, partici-
pants did not expect algorithms to change and thus did not always find it necessary to
keep analyzing why algorithms were offering different recommendations. Thus, it was
common for diarists to use such expressions as “nothing new to report” or “just like what
I said yesterday” in their accounts of why they thought they were receiving certain rec-
ommendations on TikTok.
Unlike with training, in which participants thought of algorithms as entities that mir-
rored their actions with precision and immediacy, during oscillation, they interpreted
algorithms as responding primarily to “external” forces, such as viral trends or contex-
tual issues. Agustina explained how she thought that TikTok was integrating more viral
trends into her recommendations compared to the beginning:
I have the feeling that, if there is content that is going viral and is not part of my list of interests,
it must be based on information from many users who are watching it or who want to know
more about it. Or because of advertising issues. Those were two things that made me want to
be a little more distant [from TikTok].
Agustina’s “distance” toward TikTok came from a premise that, once a “list of interests”
had been defined, algorithms should not depart from it.
In a similar manner, diarists noted that, compared to the early days of the study, algo-
rithms were recommending more videos to them about specific events (such as the war
in Ukraine or the Costa Rican presidential runoff election). These recommendations
made participants feel connected to others in the simultaneous experiences of “media
events” that took place in the hybrid formats of “real-time” (including television and
social media; Sumiala et al., 2018). Noticing how these events unfolded on TikTok also
shaped the awareness of diarists. During a focus group, Agustina accounted for the
repeated presence of videos about one presidential candidate on her “For You” page by
noting it could only be because of sponsored posts or because “the algorithm” was rec-
ommending what other people “with the same profile” were interested in.
For several participants, the expectation derived from assessing one video after the
next made the app worthwhile and the experience of using it exhilarating, even if uncom-
fortable. “I feel this is vice-inducing,” wrote Larissa. These participants developed, as it
were, a feel for the rhythm of “oversaturation” on TikTok.
Diarists offered two rationales for continuing to use TikTok despite algorithmic oscil-
lation. The first rationale emphasized utilitarian reasons. As noted earlier, previous
expectations provided the background against which participants assessed the merits of
algorithms. Accordingly, some participants argued they valued TikTok because it pro-
vided them with information that was useful in their personal or professional lives. In her
seventh report, Larissa marveled at the number of ideas she had received by watching
videos on TikTok: “If you want to start a business, there is a lot out there!” She
mentioned having watched insightful tips about how to create soaps, dog biscuits, and
personalized gifts. This statement drastically differed from claims made by participants
before using TikTok. Rather than a frivolous app made for gullible others, TikTok was
now a tool for “finding information and useful content about things to do,” specially
made for each one of them, as Violeta put it.
In addition to utility, participants emphasized a logic of complementarity. In this per-
spective, TikTok was unique compared to other platforms, different enough to secure a
place in the social media ecologies of participants. That was the case for Federico, who indicated in his ninth diary report that he was already “using TikTok more than Facebook and
Instagram” because he thought it was “better to watch a video in which the content has
not been altered so much.” By the eighth day of using TikTok, Larissa arrived at a similar
conclusion: TikTok offered her videos that, she noted, she “could not find on Facebook,” and she wondered whether this difference could be explained by the quality of the algo-
rithms: “Maybe Facebook’s algorithm isn’t as good as TikTok’s.”
Rejections
Thus far, we have discussed how the use of TikTok led to various forms of algorithm
awareness over time. But awareness was also tied to the non-use of the app. Throughout
the study, some diarists manifested their desire to stop using TikTok. The most common
reason they gave was the frustration derived from training activities. For these par-
ticipants, the rhythm of repetition had become tiresome. Karppi et al. (2021) aptly
describe how awareness is tied to the search for disconnection as it leads to “an awaken-
ing of a counter-mood with a desire to act and to knowledge of what to do” (p. 1604).
That was precisely Federico’s case. In his final entry, exactly 1 month after he began
using TikTok, he wrote, “There hasn’t been any new content for days, nothing that
catches my attention.” This led him to a concrete desire for disconnection as a temporal
rhythm to relate to the app: “I have voluntarily used [TikTok] less,” he indicated in the
same report. Other participants noted they also had become “bored” of the app because
it kept recommending videos that were not of interest to them. Nothing they had done
seemed to have changed this pattern.
Other diarists who considered stopping using the app after the end of the study focused
on “oversaturation” and the oscillation of content between what they deemed interesting
and uninteresting video recommendations. For these participants, TikTok’s “choppy”
rhythm and deviation from their expectation of algorithmic stability proved difficult to
overcome. Framing it as an affective matter, Sonia, the 66-year-old translator, argued in a
focus group discussion that she had felt gradually more distant from the app: “I kept getting
cold. It got to a point where it got so cold I didn’t feel like using [TikTok] anymore.” For
Sonia, rejecting TikTok thus became a means to reassert her agency in relation to the app.
One final reason for rejecting TikTok was provided by those who felt the app was not
offering them anything sufficiently different from what they could find on other plat-
forms. Ricardo, the woodworker, offered a detailed explanation of this during a focus
group discussion:
I used [TikTok] but it didn’t give me the same feeling that Instagram does. It seemed to me that
it recommended very meaningless things. I think that TikTok comes at a time in my life when
I am already in love with Instagram and it didn’t make me break my relationship with Instagram,
an app that I’ve had since it started, that I’ve always liked, and where I think I can find more
friends with things in common.
After evaluating TikTok’s algorithms for a month, participants like Ricardo found
them wanting. Time was key in their explanations: TikTok came at a moment when their
affective ties to other platforms had already been formed. Ricardo’s words are a reminder
that algorithm awareness takes place as part of larger ecologies where platforms acquire
meaning and identity over time (Espinoza-Rojas et al., 2022).
Finally, our study helped broaden the understanding of the temporality of social media
platforms in two main ways. First, we demonstrated how a specific platform can orches-
trate multiple temporal rhythms. Most scholars have tended to assign one main kind of
pace to each specific platform (Weltevrede et al., 2014). As a supplement, we showed
how our interlocutors experienced various rhythms (desynchronicity, repetition, har-
mony, oversaturation, disconnection) in their relationship with the same app.
“Algorhythms” should thus be considered as temporal enactments in themselves rather
than a fixture of digital devices or given time regimes.
Second, we revealed how the experience of these rhythms was also tied to various
forms of awareness. Our analysis thus helped expand preliminary findings in the litera-
ture. In an examination of the “algorhythms” of media ecologies in London, Lupinacci
(2022) showed how people tended to be more aware of ephemeral content because of
the risk of not being able to watch it again. Our study provided further empirical evi-
dence of this process by examining how the experience of algorithmic rhythms shaped
forms of cognitive, affective, and practical awareness. Diarists developed different forms
of algorithm awareness based on whether they felt their relationship with them was
desynchronized, repetitive, harmonious, overly saturated, or if they considered they
needed to disconnect from the app.
Studying the experiences of people over 1 month made it possible to recognize very
specific activities that might have been difficult to identify otherwise. But it also might
have prevented us from examining a broader range of factors shaping awareness. Future
research could identify people’s awareness of algorithms over more extended periods.
These studies could also clarify whether there are temporal patterns (sequential, concur-
rent, and cyclical) in which people experience these five activities and others.
Acknowledgements
The authors wish to thank Kelley Cotter, Andreas Schellewald, and Brita Ytre-Arne for their inval-
uable suggestions on earlier drafts of this work. They also thank the anonymous reviewers for their
excellent comments about previous versions of this article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/
or publication of this article: This work was supported by the Vicerrectoría de Investigación,
Universidad de Costa Rica, Project no. C0451.
ORCID iD
Ignacio Siles https://orcid.org/0000-0002-9725-8694
References
Boffone T (ed.) (2022) TikTok Cultures in the United States. New York: Routledge.
Bucher T (2020) The right-time web: theorizing the kairologic of algorithmic media. New Media & Society 22(9): 1699–1714.
Carmi E (2020) Rhythmedia: a study of Facebook immune system. Theory, Culture & Society 37(5): 119–138.
Cohn J (2019) The Burden of Choice: Recommendations, Subversion and Algorithmic Culture. New Brunswick, NJ: Rutgers University Press.
Cotter K (2022) Practical knowledge of algorithms: the case of BreadTube. New Media & Society 0(0): 1–20. DOI: 10.1177/14614448221081802.
Cotter K and Reisdorf BC (2020) Algorithmic knowledge gaps: a new dimension of (digital) inequality. International Journal of Communication 14: 745–765.
DeVito MA, Birnholtz J, Hancock JT, et al. (2018) How people form folk theories of social media feeds and what it means for how we study self-presentation. In: Proceedings of the 2018 CHI conference on human factors in computing systems, Montreal, QC, Canada, 21–26 April, pp. 1–12. New York: ACM.
Dogruel L, Masur P and Joeckel S (2021) Development and validation of an algorithm literacy scale for Internet users. Communication Methods and Measures 16(2): 115–133.
Eslami M, Rickman A, Vaccaro K, et al. (2015) “I always assumed that I wasn’t really that close to [her]”: reasoning about invisible algorithms in the news feed. In: 33rd annual SIGCHI conference on human factors in computing systems, Seoul, Republic of Korea, 18–23 April, pp. 153–162. New York: ACM.
Espinoza-Rojas J, Siles I and Castelain T (2022) How using various platforms shapes awareness of algorithms. Behaviour & Information Technology. Epub ahead of print 17 May. DOI: 10.1080/0144929X.2022.2078224.
Gomart E and Hennion A (1999) A sociology of attachment: music amateurs, drug users. The Sociological Review 47(1): 220–247.
Gran AB, Booth P and Bucher T (2021) To be or not to be algorithm aware: a question of a new digital divide? Information, Communication & Society 24(12): 1779–1796.
Gruber J, Hargittai E, Karaoglu G, et al. (2021) Algorithm awareness as an important internet skill: the case of voice assistants. International Journal of Communication 15: 1770–1788.
Hargittai E, Gruber J, Djukaric T, et al. (2020) Black box measures? How to study people’s algorithm skills. Information, Communication & Society 23(5): 764–775.
Jordheim H and Ytreberg E (2021) After supersynchronisation: how media synchronise the social. Time & Society 30(3): 402–422.
Kang H and Lou C (2022) AI agency vs. human agency: understanding human–AI interactions on TikTok and their implications for user engagement. Journal of Computer-Mediated Communication 27(5): 1–13.
Karppi T, Chia A and Jorge A (2021) In the mood for disconnection. Convergence 27(6): 1599–1614.
Kizilcec RF (2016) How much information? Effects of transparency on trust in an algorithmic interface. In: Proceedings of the 2016 CHI conference on human factors in computing systems, 2016, pp. 2390–2395.
Kruger HA and Kearney WD (2006) A prototype for assessing information security awareness. Computers & Security 25(4): 289–296.
LABS (2022) TikTok soars in Latin America: app hits the 100 million user mark. Available at: https://labsnews.com/en/news/technology/tiktok-soars-in-latin-america-app-hits-the-100-million-user-mark/ (accessed 5 May 2022).
Siles I (2023) Living with Algorithms: Agency and User Culture in Costa Rica. Cambridge, MA: MIT Press.
Siles I and Meléndez-Moran A (2021) “The most aggressive of algorithms”: user awareness of and attachment to TikTok’s content personalization. Paper presented at the 71st Annual Conference of the International Communication Association.
Siles I, Espinoza-Rojas J, Naranjo A, et al. (2019) The mutual domestication of users and algorithmic recommendations on Netflix. Communication, Culture & Critique 12(4): 499–518.
Siles I, Segura-Castillo A, Solís-Quesada R, et al. (2020) Folk theories of algorithmic recommendations on Spotify: enacting data assemblages in the global south. Big Data & Society 7(1): 1–15.
Author biographies
Ignacio Siles is a professor of media and technology studies in the School of Communication and a
researcher in the Centro de Investigación en Comunicación (CICOM) at Universidad de Costa
Rica.
Luciana Valerio-Alfaro is a student in the School of Communication at Universidad de Costa Rica.
Ariana Meléndez-Moran is a graduate of the School of Communication at Universidad de Costa
Rica.