A framework of pre-service teachers' conceptions about digital literacy
Keywords: Information literacy; Cross-cultural projects; 21st century abilities; Post-secondary education; Teaching learning strategies

Abstract
We examine the conceptions of digital literacy of pre-service teachers in the United States (n = 188) and Sweden (n = 121). Pre-service teachers were asked to define digital literacy in an open-ended fashion and to select those skills that they considered to be essential for digital literacy from a list of 24 skills provided. Based on pre-service teachers' open-ended responses, four profiles of digital literacy conceptions, progressing in sophistication, were identified (i.e., technology focused, digital reading focused, goal directed, reflecting critical use). Moreover, pre-service teachers' selections of skills or competencies essential for digital literacy were used in cluster analysis. Profiles of digital literacy conceptions were consistent across open-ended and selected-response forms of assessment. Important similarities and differences in conceptions of digital literacy across the United States and Sweden are discussed, as are implications for improving teacher education.
1. Theoretical frame
The Programme for International Student Assessment (PISA) defines digital literacy as students’ ability to: “evaluate information
from several sources, assessing the credibility and utility of what is written using self-established criteria as well as the ability to solve
tasks that require the reader to locate information, related to an unfamiliar context, in the presence of ambiguity and without explicit
directions" (OECD, 2015, p. 50). In 2006, the European Union launched an initiative emphasizing digital competence, based on the OECD's identified competence areas. The definition of digital competence that emerged, referred to as DigComp 2.0 (European Commission, 2017), has developed from a definition that mainly focused on operational and technical "know-how" toward one that includes more knowledge-oriented cognitive, critical, and socially responsible perspectives. Similar definitions abound
(Buckingham, 2010; Fraillon, Ainley, Schultz, Friedman, & Gebhardt, 2014; Hobbs & Coiro, 2019; Spante, Hashemi, Lundin, & Algers,
2018).
Across these definitions there has been the shared recognition that digital literacy stands as a critical competency for today’s
learners, challenged by the technological, informational, cognitive, and socio-emotional demands of the digital age. Despite the
emphasis that has been placed on digital literacy throughout the literature, less is known about how individuals themselves may define
digital literacy and the competencies that it requires. The focus of this study is to understand how pre-service teachers, who will soon be tasked with instructing their own students in digital literacy, define or conceptualize this construct. We examine definitions
* Corresponding author. Dept. of Educational Psychology, Counseling, and Special Education, The Pennsylvania State University, United States.
E-mail address: [email protected] (A. List).
https://doi.org/10.1016/j.compedu.2019.103788
Received 1 April 2019; Received in revised form 13 December 2019; Accepted 18 December 2019
Available online 3 January 2020
0360-1315/© 2020 Elsevier Ltd. All rights reserved.
A. List et al. Computers & Education 148 (2020) 103788
of digital literacy among pre-service teachers in two national and educational contexts, the United States and Sweden, and propose a
framework for the progressive development of pre-service teachers’ conceptions of digital literacy.
We consider the examination of conceptions of digital literacy across national, cultural, and educational settings to be essential, as
preparedness in digital literacy has been considered necessary for students’ economic participation across developed nations,
including the United States and European nations, and a point of international cooperation across governments (Chetty, Qigui, Gcora, Josie,
Wenwei, & Fang, 2018; Van Dijk, 2009). The United States and Sweden were selected as national settings because of important
commonalities and differences in these nations’ approaches to digital literacy. On the one hand, these both reflect developed countries,
wherein access to technology is plentiful; on the other hand, while Sweden has demonstrated a commitment to digital literacy (i.e.,
referred to as digital competence) in its national curriculum (Swedish Ministry of Education, 2018), no similarly broad and centralized
commitment in the United States has been forthcoming for reasons of both curriculum and policy (Davis, South, & Stevens, 2018). In
this study, we examine the extent to which such commonalities and differences contribute to conceptions of digital literacy among
pre-service teachers in the United States and Sweden.
A variety of frameworks defining digital literacy and associated constructs [e.g., information literacy, Internet and communications
technology (ICT) literacy, multimedia literacy, 21st century skills] populate the literature (Alexander & the Disciplined Reading and Learning Research Laboratory, 2012; Bawden, 2008; Spante et al., 2018; Stordy, 2015). Ng (2012), drawing together various definitions, suggests
that digital literacy arises at the intersection of students’ technical, cognitive, and socio-emotional competencies. The technical
dimension of digital literacy includes students' "technical and operational skills to use ICT for learning and in every-day activities" (Ng,
2012, p. 1067). The cognitive dimension encompasses the skills students need to search for, evaluate, and create digital information as
well as students’ abilities to critically analyze this information. Finally, the socio-emotional dimension of digital literacy requires that
students be able to use ICTs for responsible communication, collaboration, and other social goals related to learning.
In a more expansive framework, Eshet-Alkalai (2004) considers students' digital literacy to include a set of five interrelated literacies. These are photo-visual literacy, reproduction literacy, branching literacy, information literacy, and socio-emotional literacy.
Photo-visual literacy refers to students’ skills in “reading” or comprehending the graphic and other multimedia information that
characterizes content on the Internet. Reproduction literacy, or synthesis, refers to students’ skills in combining disparate pieces of
information to create a novel product. Branching literacy refers to students’ skills in navigating the range of information available
online and is particularly engaged when learners try to traverse hypertexts, characterized by their hierarchical or lateral linking to
one another. Information literacy refers to the skills involved in analyzing and evaluating the variety of information available online
and is necessary for students to be critical consumers of said information. Finally, socio-emotional literacy refers to students’
adherence to online norms for collaboration and communication, and the social sharing of information on the Internet. In an update to this framework, Eshet-Alkalai (2012) added real-time thinking as a sixth literacy, corresponding to students' competencies in
simultaneously processing a large volume of stimuli, as is done during online learning or video game play.
Beyond these efforts to define digital literacy, limited work has examined students' own definitions of digital literacy, referred to in this study as conceptions because of their often componential nature, that is, the skills and competencies that students consider to be essential for digital literacy. Although students' conceptions of digital literacy have yet to be directly examined, at least three related literatures may have a bearing on these: literatures on students' attitudes toward technology adoption, on their beliefs regarding their own digital literacy skills, and on investigations of professional competence, specifically within pre-service teacher populations.
In this study, we examine pre-service teachers’ conceptions of digital literacy. Our focus on pre-service teachers’ conceptions is
motivated by a desire to understand how such beliefs may transfer to the instructional choices that teachers make in the classroom,
with these decisions translating to students’ academic experiences with digital literacy. We view such conceptions as the origin point
for teachers’ decision-making about curriculum, methods, and assessment when supporting students’ digital literacy development. In
examining conceptions of digital literacy, in this study we build on prior work by adopting a more culturally comparative approach. In
particular, we juxtapose the digital literacy conceptions of pre-service teachers in the United States and Sweden. Comparing conceptions of digital literacy in the United States and Sweden allows for an informative cross-cultural investigation for a number of
reasons. For one, these both represent developed nations where the importance of digital literacy has been emphasized by political,
economic, and educational institutions. Indeed, both countries have competencies related to digital literacy or instruction with
technology featured in their standards for teacher education (CAEP, 2018; OECD, 2015; Council of the European Union, 2018). At the
same time, these two settings differ in the school contexts within which digital literacy may be expected to develop. In particular, these
settings differ in both their curricula and in their assessment practices with regard to digital literacy. While Sweden has a central
national curriculum, emphasizing digital literacy, curricula in the United States are localized, with different states and districts
differentially defining and emphasizing digital literacy in their instructional standards (Porter, McMaken, Hwang, & Yang, 2011;
Schmidt, Whang, & McNight, 2005).
Second, while teachers in Sweden have been found to be quite concerned with students’ performance on standardized assessments
of digital literacy (Brante & Stang Lund, 2017), standardized assessments in the US have typically not tapped digital literacy (Davis
et al., 2018). In particular, secondary students in Sweden are required to compose argumentative essays based on information presented across multiple texts as a part of their standardized assessment process. In the United States, standardized assessments are much
more likely to target domain-general reading comprehension skills, manifest in relation to a single text. Research on digital literacy in
Sweden has mainly followed two tracks: how digital technology is implemented in classrooms (e.g., Andersson & Sofkova-Hashemi, 2016; Bergdahl, Fors, Hernwall, & Knutsson, 2018; Sofkova Hashemi & Cederlund, 2017) or how students search for and evaluate information on the Internet (Nygren & Guath, 2019; Sundin, 2015). Research concerning conceptions of digital literacy among pre-service teachers is not, to the best of our knowledge, available.
More generally, cross-cultural and cross-national comparisons have long been a tradition in beliefs research, providing key insights
into the universality of conceptions of knowledge, information, and now digital literacy (List, Peterson, Alexander, & Loyens, 2018;
Maggioni, Riconscente, & Alexander, 2006; Muis & Sinatra, 2008). Examining conceptions across national settings that are similar yet
distinct allows for their nature to be better understood. That is, it allows us to consider what aspects of conceptions of digital literacy
may be commonly held among pre-service teachers in economically developed nations versus which aspects may be distinct. However,
recent work has only begun to examine digital-literacy-related constructs comparatively (e.g., Cai & Gut, 2018; Davis et al., 2018; Noh,
2019). In this initial investigation, we seek to identify some emergent similarities and differences in pre-service teachers' digital literacy conceptions across two national settings. Our hope is that future investigations will examine such differences further and expand
the analyses in this study to other national settings, similarly concerned with issues of digital literacy. In this study, we compare
conceptions of digital literacy in the United States and in Sweden as a way of further examining not only the universality of such
conceptions but also their manifestation within particular cultural, educational, and national contexts. We do this by collecting both
quantitative and qualitative data of pre-service teachers’ conceptions. First, pre-service teachers were asked to define their conceptions
of digital literacy in an open-ended fashion. Then, students were asked to select those elements that they considered to be most
essential for digital literacy, from a set of 24 options developed based on frameworks of digital literacy introduced in prior work (e.g.,
Ng, 2012). Thus, we focus on the following research questions:
1. What are pre-service teachers’ conceptions of digital literacy in the United States?
2. What are pre-service teachers’ conceptions of digital literacy in Sweden?
3. How do conceptions of digital literacy differ in the United States vis-à-vis Sweden?
2. Methods
This study employed an exploratory mixed methods design. This mixed-method approach allowed us to use qualitative results to
inform our quantitative analysis (Creswell & Plano Clark, 2011). Such a design is especially useful when a clear theoretical framework
does not yet exist (Creswell & Plano Clark, 2011), as was the case in this study: pre-service teachers' conceptions of digital literacy have not yet been examined in prior work. We began with the coding of participants' open-ended definitions to get a sense of
pre-service teachers’ conceptions of digital literacy. Then, based on the themes emerging from the qualitative coding, we chose items
from the quantitative portion of our survey to cluster into profiles of digital literacy conceptions. All data were collected at one time
point in an electronic survey via Qualtrics (in the United States) and Google Forms (in Sweden). A comparison of contexts, samples, and
methods across the two countries can be found in Table 1.
In the United States, pre-service teacher preparation includes a bachelor’s degree in elementary, middle grades, or secondary
education, typically earned over four years. During those four years, pre-service teachers complete general education courses (e.g.,
educational psychology), methods classes (e.g., reading instruction), and a student-teaching practicum, during which they are placed
in classrooms and supervised by a mentor teacher. Pre-service teachers pursuing secondary certification typically also complete
content and methods courses (e.g., biology and biology instruction) in their subject area of interest.
Table 1
Comparison of samples and data collection across countries.

Context
  Pre-service teacher preparation
    United States: Bachelor's degree in elementary, middle grades, or secondary education, typically earned over four years.
    Sweden: 3.5 years for kindergarten teachers; 4 years for grades 1-6; 4.5-5.5 years for grades 7-12.
  Preparation program focus
    United States: General education courses, methods classes, and a student-teaching practicum.
    Sweden: Subject studies, educational science, and a placement in schools.
  Guiding standards
    United States: Council for the Accreditation of Educator Preparation: "model and apply technology standards as they design, implement and assess learning experiences to engage students and improve learning."
    Sweden: Higher Education Ordinances: "be able to demonstrate the ability to safely and critically use digital tools in the educational setting and to take into account the role of different media and digital environments in educational settings."

Sample
  N
    United States: 188. Sweden: 121.
  Sampling method
    Both countries: Convenience, voluntary.
  Sampling frame
    United States: Those enrolled in education majors at a large, public university in the United States.
    Sweden: Those enrolled in pre-service training at four different universities in Sweden.
  Data collection method
    United States: Electronic survey in Qualtrics administered in a lab environment.
    Sweden: Electronic survey in Google Forms, sent to participants to complete at a time and location of their choosing.
  Time point
    United States: Fall 2017. Sweden: Spring 2018.

Methods
  Survey language
    United States: English. Sweden: Swedish, with responses translated into English by a researcher.
  Open-ended question (both countries)
    "Recently, there has been a push to teach students digital literacy or to develop students' 21st century literacy skills. How do you define digital literacy? What skills do you consider to be necessary for digital literacy?"
  Selected-response instructions
    United States: "Below we list different skills that have been associated with digital literacy. Please select the five (5) skills you consider to be most critical for digital literacy."
    Sweden: Identical, except participants were asked to select the six (6) skills they considered most critical.
To some extent, instruction in digital literacy is well-represented in teacher education programs in the United States. Specifically,
the Council for the Accreditation of Educator Preparation (CAEP) lists among its standards for teacher preparation programs that
teachers be able to: “model and apply technology standards as they design, implement and assess learning experiences to engage
students and improve learning.” Nevertheless, being able to use technology as a part of instruction does not ensure that teachers
develop well-formed conceptions of digital literacy and understand how it may be developed and assessed in their students (Simard &
Karsenti, 2016).
Participants were 188 undergraduate students at a large, public university in the United States. This was a convenience sample with
voluntary participation. Students cannot be considered to be representative of pre-service teachers in the United States. Students, 18
and older, enrolled in an educational psychology course that constitutes an introductory class for the teacher education program were
invited to participate for extra credit. In the sample, 38.30% (n = 72) were interested in teaching at the early elementary level (i.e., Pre-Kindergarten to 2nd grade), 31.91% (n = 60) were interested in teaching elementary school (i.e., 3rd-5th grade), 15.43% (n = 29) wanted to teach middle school (i.e., 6th-8th grade), and 31.91% (n = 60) were interested in teaching high school (9th-12th grade). As a note, these percentages do not add up to 100%, as students could indicate that they were interested in teaching more than one grade level. Some students (n = 9) further specified their grade preference as other, indicating that they were interested in teaching special education, English as a second language, or post-secondary students. Data were collected in Fall 2017, approximately two months into the semester. Data were collected as part of a broader study examining students' conceptions of digital literacy
(List, 2019) and multiple text task performance, a task associated with digital literacy; however, data are uniquely presented in this
study.
Swedish teacher education differs in length, depending on which age group pre-service teachers are aiming to teach. To become a
kindergarten-teacher, students study for 3.5 years; to teach up to 6th grade, students study for four years. To teach grades 7–12,
students need to study for 4.5–5.5 years, depending on how many subjects they intend to teach. Swedish teacher education programs
consist of three interwoven parts: one is subject studies, another is educational science (e.g. knowledge about the teaching profession
and schools as social, pedagogical, and didactic environments), and the third is placement in schools, where pre-service teachers
develop skills in planning and delivering lessons, classroom management, and relationship building with students. Standards for
teacher education programs, found in the Higher Education Ordinances (SFS, 2018:1053), address digital literacy by asking students to
“be able to demonstrate the ability to safely and critically use digital tools in the educational setting and to take into account the role of
different media and digital environments in educational settings.”
Participants were 121 pre-service teachers at four different universities in Sweden. This was also a convenience sample with
voluntary participation and was not representative of Sweden, as a whole. Invitations to the survey were sent to contacts at the
universities with a request that it be sent out to pre-service teachers in their first semester of study. Invitations to participate were
distributed via email and via the learning management systems used by the various universities. Among participants, 38.01% (n = 46) planned to teach Kindergarten, 15.70% (n = 19) intended to teach in primary school, 28.10% (n = 34) in middle school, 8.26% (n = 10) in school years 7-9, and 8.26% (n = 10) in school years 10-12. One participant (0.83%) was not sure about the grades they intended to teach and one participant (0.83%) reported an intent to teach another age group. Data were collected in Spring 2018.
Participants had studied for approximately four weeks in their teacher education programs prior to responding to the survey.
2.4. Measures
Participants in the U.S. and Sweden were asked to complete parallel measures of their conceptions of digital literacy. Conceptions
of digital literacy were assessed using open-ended and selected-response questions. All questions were asked in an electronic survey
with participants typing their responses to the open-ended question and selecting their responses from a list for the selected-response
question.
2.4.1.1. Coding process. While using a top-down coding scheme based on prior research (e.g., Eshet-Alkalai, 2004; Ng, 2012) was
initially attempted, pre-service teachers’ definitions of digital literacy were found to be more limited than those populating the
literature. Therefore, a bottom-up coding scheme was adopted. Conceptions of digital literacy were coded through a four-step process.
Initially, the first and second authors of the paper independently reviewed participants’ definitions of digital literacy to identify any
patterns or categories that emerged. The authors then discussed these patterns and, together, categorized responses according to these
patterns. Once categories could be consistently identified in pre-service teachers’ responses, authors created working definitions for
each of these categories and identified prototypical examples. Third, authors independently coded responses for a second time, placing
them into the common categories established. Finally, authors again came together to corroborate their coding and to establish
inter-rater consistency. This four-step process was carried out first for participants in the United States and then for those in Sweden,
with consistent categories identified across these two settings. Raters coded the open-ended responses first, separately from examining
the selected-response, skill-preference question.
2.4.1.2. Coding scheme. Pre-service teachers’ open-ended conceptions of digital literacy were placed into one of four categories,
considered to progress in sophistication (see Fig. 1). Specifically, participants’ open-ended definitions were coded as reflecting
technology-focused, digital reading, goal-directed, and critical use-based conceptions of digital literacy. These categories were created
to be mutually exclusive, such that responses were placed into one and only one category of digital literacy conceptions. Additionally,
these categories were created to be exhaustive, with as many responses as possible classified into these four categories.
Coding categories and sample responses are presented in Table 2. For the U.S. sample, Cohen's kappa inter-rater agreement was 0.81, based on 40 student responses (21.28% of the sample; exact agreement: 85.00%). For the Swedish sample, two raters coded 36 student responses (29.75% of the sample), with Cohen's kappa inter-rater agreement equal to 0.79, indicating strong agreement (exact agreement: 83.33%). In instances when responses could conceivably be placed into more than one category, the more sophisticated category was selected. For instance, a response such as: Digital literacy is reading, writing, and comprehending through materials on a digital screen (for example, computers, television, cellular device, tablet.) For digital literacy, someone needs to be competent in regular literacy and also have knowledge about technologies, could be placed either into the digital reading category or into the technology-focused category because of its insistence that digital literacy requires "knowledge about technologies." In such an instance, the more sophisticated digital reading category was selected. At times, students identified skills not explicitly mapping onto our digital literacy framework. For instance, one student defined digital literacy as: The skills necessary for digital literacy are being capable of using technology and being proficient in its use. Also, being able to communicate through technology, with "communicating through technology" constituting a unique response, one not expressly included among the conceptions of digital literacy that we identified. Nevertheless, in this case, the response was placed into the technology-focused category because of its focus on the role of technology in facilitating communication. More generally, when classifying responses that identified unique skills, our approach was to determine the category of best fit.
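Cohen's kappa, used above to index inter-rater consistency, adjusts the raw proportion of agreement for the agreement expected by chance from each rater's marginal category frequencies. A minimal sketch in Python, using invented category labels rather than the study's coding data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters coding the same set of items."""
    n = len(rater_a)
    # Observed proportion of agreement.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Invented codes for four responses (illustrative only, not the study's data).
rater_1 = ["technology", "technology", "digital reading", "digital reading"]
rater_2 = ["technology", "technology", "digital reading", "technology"]
print(cohens_kappa(rater_1, rater_2))  # 0.5
```

Here the raters agree on 3 of 4 items (75%), but half of that agreement is expected by chance given the marginal frequencies, so kappa is 0.5, lower than the raw agreement.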
Table 2
Summary of coding categories for open-ended responses.

Technology Focused
  Definition: An understanding of digital literacy that is technology driven and focused on mastering specific technological tools (e.g., computers, Internet use).
  Example: "I consider digital literacy to be having an understanding of technology and how it works. The skills I consider necessary would be typing, using a computer like programs such as Google and Microsoft Word, and being able to use an iPhone correctly."

Digital Reading
  Definition: An understanding of digital literacy that is focused on the translation of traditional print literacy to digital contexts and all that entails.
  Example: "I would define digital literacy as readings and information being available online. You can not only access one piece of information online, you can access multiple sources all at once. Skills I think are necessary are knowing how to use and navigate a computer, Reading skills, and focus."

Goal Directed
  Definition: An understanding of digital literacy that is focused on using digital tools to accomplish specific tasks.
  Example: "… I would describe it as figuring out ideas and concepts that you need to through a more digital use and reading what you need in order to complete that."

Critical Use
  Definition: An understanding of digital literacy that sees it as the reflective and evaluative process of using technology and reading digitally, to accomplish task goals.
  Example: "I would define digital literacy as the ability to have an understanding and the ability to be digitally savvy, whether that be knowing what technological resource to use and when or understanding the implications of the digital age. I consider an open mind and maturity necessary to be necessary for digital literacy."
Table 3
Categories of open-ended responses by country (frequencies and percentages for the United States and Sweden).

Table 4
Endorsement of selected-response items by country (frequencies and percentages for the United States and Sweden).
Note. (a) Significant differences within country when comparing to the proportion of the item selected by chance. (b) Significant differences between countries in the proportion of item selection.
as possible. This resulted in a final 24-item list of digital literacy skills identified. Skill descriptions were then phrased to be both
maximally inclusive and distinct from one another. Participants were given the opportunity to write in any skills they considered to be essential when completing this measure; however, none added any skills as essential for digital literacy. This was the case across both
national settings.
Data for this study were collected as part of a larger examination of pre-service teachers’ multiple text use in the digital age but are
uniquely presented in this manuscript. These analyses are based on pre-service teachers’ responses to a researcher-designed survey of
digital literacy. Participants completed the digital literacy survey in a computer lab, with a researcher present, using the Qualtrics
survey platform. Responses were typed by participants within the electronic survey platform.
Data analysis procedures were identical for both samples. We began with the qualitative portion of our analysis, coding the open-
ended responses via the four-step process outlined above. Coding was done manually with the use of Excel for tracking. Pre-service
teachers' open-ended conceptions of digital literacy were coded into one of four categories (i.e., technology-focused, digital reading, goal directed, critical use), with responses that could not be otherwise coded placed into an "other" category.
We then quantitatively examined pre-service teachers’ selections of the skills or competencies that they considered to be essential
for digital literacy. First, we examined the frequency with which participants selected various skills as essential for digital literacy.
Second, cluster analysis was used to see whether patterns in participants' digital literacy conceptions could be determined based on a subset of skills selected. To this end, a subset of the 24 digital literacy skills that participants had to choose from was selected for use in a cluster analysis. The goal of the cluster analysis was to identify patterns in pre-service teachers' skill selections and examine the extent to which these mapped onto the conceptions of digital literacy identified based on their open-ended responses. Due to limitations in sample size, all twenty-four skills could not be included in our cluster analysis. Rather, four skills were chosen as predictors for clustering (i.e., "using a variety of technology tools," "using and understanding multimedia," "gathering information to make decisions and learn about topics of interest," and "critically evaluating source trustworthiness"). These four skills were selected by the researchers based on their alignment with the four conceptions of digital literacy identified in participants' open-ended responses.
A two-step cluster analysis was performed with pre-service teachers' selection of each of these four skills, or not, used as binary predictor variables. Two-step clustering was selected as the preferred clustering algorithm for three primary reasons. First, two-step clustering has been described as an effective clustering method due to its integration of hierarchical and partition-based approaches to clustering (Sarstedt & Mooi, 2014, pp. 273–324). Second, two-step cluster analysis did not require a specific number of clusters to be designated a priori (Norušis, 2011); rather, the optimal number of clusters was determined based on the Bayesian Information Criterion (BIC). The exploratory nature of this study led us to prefer this statistical basis for cluster determination. Finally, two-step clustering is able to use both continuous and categorical variables as predictors (Norušis, 2011), the latter of which were used in this study. Given the binary nature of the four predictors used in our analyses, clusters were determined based on the log-likelihood distance.
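SPSS's proprietary two-step algorithm has no exact open-source counterpart, but its key feature, letting the BIC select the number of clusters rather than fixing it a priori, can be sketched in Python. The sketch below is an illustrative approximation only: the data are synthetic, and a Gaussian mixture stands in for the two-step procedure.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic stand-in for the study's data: four binary skill indicators
# (1 = skill selected as essential, 0 = not) for 188 hypothetical respondents.
X = rng.integers(0, 2, size=(188, 4)).astype(float)

# Fit mixtures with 1..6 components, keeping the BIC for each; the
# solution with the lowest BIC determines the cluster count, mirroring
# the BIC-based selection used by two-step clustering.
bics = {}
for k in range(1, 7):
    gm = GaussianMixture(n_components=k, covariance_type="diag",
                         random_state=0).fit(X)
    bics[k] = gm.bic(X)

best_k = min(bics, key=bics.get)
labels = GaussianMixture(n_components=best_k, covariance_type="diag",
                         random_state=0).fit(X).predict(X)
```

With categorical predictors, SPSS's two-step procedure instead uses the log-likelihood distance; the Gaussian mixture here is only a convenient analogue for demonstrating BIC-based selection of the cluster count.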
Finally, we examined the relation between categories of digital literacy conceptions, determined based on participants' open-ended responses, and cluster membership, determined based on participants' answers to the selected-response question. A chi-squared test of association was used to determine whether the categories that pre-service teachers were placed into, based on their open-ended responses, corresponded to their cluster membership. All quantitative analyses were performed in SPSS 23.0.
3. Results
Conceptions of digital literacy, gathered via open-ended and selected response questions, were first examined independently within
each national setting (RQ1 and RQ2), and then compared across contexts (RQ3).
3.1. RQ1: What are pre-service teachers’ conceptions of digital literacy in the United States?
For the first research question, we examined U.S. pre-service teachers’ open-ended and selected response conceptions of digital
literacy.
3.1.1. Open-ended
See Table 3 for the frequency of pre-service teachers placed into each category. A chi-squared goodness of fit test determined membership to be disproportionately distributed across categories [χ2(4, n = 188) = 76.04, p < .01, Cramér's V = 0.32].
3.1.2. Selected-response
Selected-responses were analyzed in terms of frequency of selection, profiles of selection, and the association between open-ended
and selected-response profiles.
3.1.2.1. Frequency of skill selection. See Table 4 for the frequency with which pre-service teachers selected each item. Four items were
chosen by more than 30% of respondents as essential for digital literacy. A chi-squared goodness of fit test was used to determine
whether participants selected each of the 24 available skills to a disproportionate extent. As participants in the U.S. were asked to select
5 out of 24 skills, each skill had a 20.83% chance of being selected at random. We were interested in whether any skills were
disproportionately over- or under-selected, relative to chance (i.e., 20.83%). Nine skills were found to be disproportionately selected as essential for digital literacy, based on a Bonferroni-adjusted alpha of .002. These skills are indicated in Table 4 via a superscript a.
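This per-skill comparison against the chance rate can be illustrated as follows (Python/SciPy sketch; the selection counts and skill labels are hypothetical, and only the 188-respondent sample size, the 5-of-24 chance rate, and the Bonferroni-adjusted alpha come from the text):

```python
from scipy.stats import chisquare

# Hypothetical selection counts for three of the 24 skills, out of 188
# respondents each choosing 5 skills (chance rate = 5/24, about 20.83%).
n = 188
p_chance = 5 / 24
counts = {"technology tools": 92, "multimedia": 41, "source trust": 73}

alpha = 0.05 / 24  # Bonferroni adjustment across 24 skills, about .002

for skill, k in counts.items():
    # One-df goodness-of-fit: observed (selected, not selected) counts
    # against the counts expected under chance selection.
    stat, p = chisquare([k, n - k], [n * p_chance, n * (1 - p_chance)])
    flagged = p < alpha  # disproportionately selected relative to chance
```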
Table 5
Centroids for selected-response clusters in the U.S. Sample.
Cluster 3 Cluster 1 Cluster 4 Cluster 5 Cluster 2
21.81% (n = 41) 15.96% (n = 30) 25.53% (n = 48) 20.74% (n = 39) 15.96% (n = 30)
Using a variety of technology tools 1.00 (0.00) 0.00 (0.00) 0.44 (0.50) 0.31 (0.47) 0.00 (0.00)
Using and understanding multimedia 0.42 (0.50) 1.00 (0.00) 0.13 (0.33) 0.03 (0.16) 0.00 (0.00)
Gathering information to make decisions 0.00 (0.00) 0.00 (0.00) 1.00 (0.00) 0.00 (0.00) 0.00 (0.00)
Critically evaluating source trustworthiness 0.00 (0.00) 0.30 (0.47) 0.23 (0.42) 1.00 (0.00) 0.00 (0.00)
Note: Clusters ordered by progression of conceptualizations. For validation, a series of chi-squared tests were performed to ensure that the clusters identified varied in their endorsement of the target skills examined as essential for digital literacy. As expected, the five clusters identified were found to differ in their selection of items reflecting using a variety of technology tools [χ2(4, n = 188) = 103.70, p < .01, Cramér's V = 0.74], using and understanding multimedia [χ2(4, n = 188) = 108.99, p < .01, Cramér's V = 0.76], gathering information to make decisions [χ2(4, n = 188) = 188.00, p < .01, Cramér's V = 1.00], and critically evaluating sources [χ2(4, n = 188) = 119.37, p < .01, Cramér's V = 0.80], as essential for digital literacy.
A. List et al. Computers & Education 148 (2020) 103788
3.1.2.2. Cluster analysis. Based on the four binary predictors, five clusters were identified and found to have a good fit to the data (silhouette measure of cohesion = 0.60). See Table 5 for cluster centroids. Cluster one consisted of pre-service teachers who predominantly selected the item "using and understanding multimedia" as essential for digital literacy (15.96%, n = 30). This cluster was labeled Digital Reading. Cluster two, labeled Limited (15.96%, n = 30), reflected pre-service teachers not selecting any of the four skills examined as essential for digital literacy. Cluster three consisted of those selecting the item "using a variety of technology tools" as essential for digital literacy (21.81%, n = 41). This cluster was labeled Technology Focused. Cluster four reflected pre-service teachers who selected the item "gathering information to make decisions and learn about topics of interest" (25.53%, n = 48) and was labeled Goal-Directed. Finally, cluster five included those who selected the item "critically evaluating source trustworthiness" (20.74%, n = 39) as essential for digital literacy. This cluster was labeled Critical Use. Clusters were validated using a series of chi-squared tests. See Table 5.
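The silhouette fit statistic reported for these cluster solutions can be computed for any clustering of the binary skill indicators. A minimal sketch follows, with synthetic data and k-means standing in for SPSS's two-step algorithm:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(1)
# Illustrative binary skill-selection matrix for 188 respondents.
X = rng.integers(0, 2, size=(188, 4)).astype(float)

# Any five-cluster assignment can be scored; k-means stands in for the
# two-step solution here.
labels = KMeans(n_clusters=5, n_init=10, random_state=1).fit_predict(X)

# Mean silhouette width ranges from -1 (poor) to +1 (well separated);
# values around 0.5 and above are conventionally read as good cohesion.
score = silhouette_score(X, labels)
```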
3.1.2.3. Open-ended categories and selected response clusters. A significant association was determined [χ2(16, n = 188) = 39.03, p < .01, Cramér's V = 0.23], indicating that cluster membership was associated with the categories identified from pre-service teachers' open-ended responses. Based on an examination of standardized residuals, members of the Critical Use cluster were significantly overrepresented in the critical use category (6.38%, n = 12) and underrepresented in the digital reading category (1.06%, n = 2). Members of the Technology Focused cluster were significantly underrepresented in the critical use category (0.00%, n = 0). See Table 6 for a summary of category and cluster membership.
3.2. RQ2: What are pre-service teachers’ conceptions of digital literacy in Sweden?
For the second research question, we examined Swedish pre-service teachers’ conceptions of digital literacy based on their answers
to open-ended and selected-response questions.
3.2.1. Open-ended
See Table 3 and Fig. 2 for frequencies of category membership. Pre-service teachers in Sweden most commonly conceptualized digital literacy as technology focused (35.50%, n = 43), while the next most populated category reflected those with a goal-directed (27.30%, n = 33) conception of digital literacy. However, a substantial number of pre-service teachers in Sweden composed responses that, unable to be otherwise coded, were placed into the other category (19.00%, n = 23). A chi-squared goodness of fit test determined participants' category membership to be disproportionately distributed across categories [χ2(4, n = 121) = 38.96, p < .01, Cramér's V = 0.28].
3.2.2. Selected-response
Selected-responses were again analyzed in terms of frequency of selection, profiles of selection, and the association between open-
ended and selected-response profiles.
3.2.2.1. Frequency of skill selection. See Table 4 for the frequency with which each selected-response item was endorsed. Two items were chosen by a substantial majority of respondents. These were the items stating that "searching for and selecting relevant information" (81.82%, n = 99) and "critically evaluating source trustworthiness" (72.73%, n = 88) were essential for digital literacy.
A chi-squared goodness of fit test was used to determine whether pre-service teachers selected any of the 24 available skills to a disproportionate extent. As participants in Sweden were asked to select 6 out of 24 skills as essential for digital literacy, each skill had a 25.00% chance of being selected at random. We were interested in whether any skills were disproportionately over- or under-selected, relative to the 25.00% threshold. Six items were found to be disproportionately endorsed by pre-service teachers, based on a Bonferroni-corrected alpha of .002. These skills are indicated in Table 4 via a superscript a.
3.2.2.2. Cluster analysis. Cluster analysis found a five-cluster solution to be a good fit to the Swedish data (silhouette measure of cohesion = 0.70). See Table 7 for cluster centroids. Cluster one included pre-service teachers who selected the item "critically evaluating source trustworthiness" (35.54%, n = 43) as essential for digital literacy. This cluster was labeled Critical Use. Cluster two
Table 6
Frequency of selected-response clusters by open-ended categories for U.S. Sample.
Cluster 3 Cluster 1 Cluster 4 Cluster 5 Cluster 2
21.81% (n = 41) 15.96% (n = 30) 25.53% (n = 48) 20.74% (n = 39) 15.96% (n = 30)
reflected pre-service teachers selecting the item "using a variety of technology tools" as essential for digital literacy (17.36%, n = 21) and was labeled Technology Focused. However, those in this cluster also endorsed the item "critically evaluating source trustworthiness" as essential for digital literacy. Although those in the Technology Focused cluster endorsed two items as essential for digital literacy, we favored the technology-focused explanation in naming this cluster, as it was the only cluster in which all participants chose the technology-focused skill. This decision was further supported by our view that pre-service teachers' very high endorsement of the item "critically evaluating source trustworthiness" as essential for digital literacy was partially a statistical artifact attributable to its order of presentation (i.e., it was presented first).
Cluster three consisted of pre-service teachers who endorsed the item "using and understanding multimedia" as essential for digital literacy (19.01%, n = 23). This cluster was labeled Digital Reading. Cluster four was labeled Limited (14.05%, n = 17); for the most part, it consisted of those not endorsing any of the four target skills as essential for digital literacy, although it did contain seven participants (41.18%) who chose the technology-focused skill. We decided to label it Limited for consistency. Cluster five reflected pre-service teachers who selected the item "gathering information to make decisions and learn about topics of interest" (14.05%, n = 17) as an essential competency for digital literacy. This cluster was labeled Goal-Directed. Clusters were again validated using a chi-squared test of association. See Table 7.
3.2.2.3. Open-ended categories and selected response clusters. In contrast to results for the U.S. sample, there was not a significant association between pre-service teachers' conceptions of digital literacy category membership, determined based on open-ended responses, and cluster membership [χ2(16, n = 121) = 15.18, p = .51]. See Table 8 for a summary of category and cluster membership.
3.3. RQ3: How do conceptions of digital literacy differ in the U.S. versus Sweden?
To answer the third research question, we were interested in comparing conceptions of digital literacy in the United States vis-à-vis
Sweden. We first compared pre-service teachers’ conceptions of digital literacy as captured via the open-ended question, then as
reflected in their skill selections as captured by the selected response question, and finally, as determined via cluster analysis.
Table 7
Centroids for selected-response clusters in Swedish sample.
Cluster 2 Cluster 3 Cluster 5 Cluster 1 Cluster 4
17.36% (n = 21) 19.01% (n = 23) 14.05% (n = 17) 35.54% (n = 43) 14.05% (n = 17)
Using a variety of technology tools 1.00 (0.00) 0.39 (0.50) 0.35 (0.49) 0.00 (0.00) 0.41 (0.51)
Using and understanding multimedia 0.00 (0.00) 1.00 (0.00) 0.18 (0.39) 0.00 (0.00) 0.00 (0.00)
Gathering information to make decisions 0.00 (0.00) 0.00 (0.00) 1.00 (0.00) 0.00 (0.00) 0.00 (0.00)
Critically evaluating source trustworthiness 1.00 (0.00) 0.65 (0.49) 0.53 (0.51) 1.00 (0.00) 0.00 (0.00)
Note: Clusters ordered by progression of conceptualizations. We validated our cluster solution by testing whether students in each of the five clusters identified varied according to their endorsement of the four skills examined as essential for digital literacy. As anticipated, chi-squared analyses determined that, indeed, the clusters differed in their endorsement of items reflecting using a variety of technology tools [χ2(4, n = 121) = 62.16, p < .01, Cramér's V = 0.72], using and understanding multimedia [χ2(4, n = 121) = 106.36, p < .01, Cramér's V = 0.94], gathering information to make decisions [χ2(4, n = 121) = 121.00, p < .01, Cramér's V = 1.00], and critically evaluating source trustworthiness [χ2(4, n = 121) = 73.34, p < .01, Cramér's V = 0.78], as essential for digital literacy.
Table 8
Frequency of selected-response clusters by open-ended categories for Swedish Sample.
Cluster 3 Cluster 1 Cluster 4 Cluster 5 Cluster 2
Technology Focused [& Critical] Digital Reading Goal Directed Critical Use Limited [& Technology]
17.36% (n = 21) 19.01% (n = 23) 14.05% (n = 17) 35.54% (n = 43) 14.05% (n = 17)
Pre-service teachers' open-ended responses in both national settings were coded into the same five categories: defining digital literacy as technology focused, digital reading, goal-directed, or as requiring critical use, as well as an other category. A chi-squared test of association was used to determine whether the prevalence of these categories was consistent across national settings. The chi-squared test was significant [χ2(4, n = 309) = 55.36, p < .01, Cramér's V = 0.42], indicating differences in the prevalence of category membership across the two countries. Based on an examination of standardized residuals, participants in the U.S. were significantly overrepresented in the Digital Reading category (28.19%, n = 53) whereas participants in Sweden were underrepresented in that category (1.70%, n = 2). In contrast, respondents in Sweden were significantly overrepresented in the Other category (19.00%, n = 23) whereas respondents in the United States were underrepresented in that category (2.66%, n = 5). See Table 3 for the prevalence of open-ended category membership in the United States and Sweden.
4. Discussion
The aim of the present study was to examine U.S. and Swedish pre-service teachers’ conceptions of digital literacy through open-
ended and selected response survey questions and to establish whether there were any differences in conceptions of digital literacy
across the two contexts. Drawing on pre-service teachers’ open-ended responses, we were able to identify four clearly discernible
categories (technology focused, digital reading, goal-directed, and critical use) reflecting distinct conceptions of digital literacy. These four
categories were found to manifest across both the U.S. and Swedish samples. Moreover, these categories were found to be conceptually consistent with patterns in pre-service teachers' responses to the selected-response question, which asked them to identify the skills or competencies that they consider to be essential for digital literacy, as determined via cluster analysis. Looking across open-ended and selected responses, at least four conclusions may be drawn.
First, each of the four conceptions identified reflected a definition of digital literacy found in prior research, to varying extents. A
technology-focused conception of digital literacy was consistent with the construct of digital propensity common in the literature on ICT
Table 9
Frequency of selected-response cluster membership by country.
Technology Focused Digital Reading Goal Directed Critical Use Limited
United States 41 30 48 39 30
21.81% 15.96% 25.53% 20.74% 15.96%
Sweden 21 23 17 43 17
17.36% 19.01% 14.05% 35.54% 14.05%
literacy (Nasah, DaCosta, Kinsell, & Seok, 2010). Digital propensity emphasizes that digital literacy is the result of access to and use of
technology and is often assessed via students’ reports of their technology use (Margaryan, Littlejohn, & Vojt, 2011; Thompson, 2013).
Digital reading, while less emphasized in literature-based definitions of digital literacy, has been examined in work comparing students' reading and strategy use when presented with texts digitally or in print (Brown, 2001; Mangen, Walgermo, & Brønnick, 2013; Peterson & Alexander, 2020; Singer & Alexander, 2017a, 2017b). Given the notable differences that emerge, pre-service teachers' focus on digital reading as a component of digital literacy seems well-grounded.
Goal-directed conceptions of digital literacy are well-represented in the literature. Indeed, a number of process models have been
introduced (Brand-Gruwel, Wopereis, & Walraven, 2009; Rouet & Britt, 2011) viewing digital literacy as reflective of the process of
information problem solving or of resolving task goals via technology and information use on the Internet (Mills, 2006). Finally,
conceptions of digital literacy as requiring critical use are those most commonly represented in the research literature as well as in
policy documents setting out standards for digital literacy (Bawden, 2008; Coiro, 2003; Koltay, 2011; OECD, 2015).
Comparing pre-service teachers’ open-ended conceptions of digital literacy to frameworks specified in the literature (Eshet-Alkalai,
2004; Ng, 2012) reveals several points of overlap as well as distinction. First, both Eshet-Alkalai (2004) and Ng (2012) recognize
competence with technology to be a necessary, yet insufficient, component of digital literacy. Our participants commonly defined
digital literacy only as synonymous with fluent technology use, as reflected in technology focused conceptions of digital literacy. Second,
Eshet-Alkalai (2004) introduces notions of branching literacy and photo-visual literacy, seemingly reflected in pre-service teachers’
digital reading-aligned conceptions of digital literacy. Both of these literacies recognize the characteristics of reading on the Internet
that make it a distinct process for learners, as compared to print-based reading. These characteristics include the hyperlinked or
interconnected nature of information online (i.e., branching literacy) and its simultaneous presentation via a variety of modalities (i.e.,
photo-visual literacy). Third, Eshet-Alkalai (2004) introduces the notion of information literacy as including students’ abilities to
analyze and critically evaluate information on the Internet. Such analysis and evaluation-focused conceptions of digital literacy are
reflected in the critical use category identified in pre-service teachers’ open-ended responses.
Interestingly, the ability to satisfy task demands, consistent with the goal-directed category of digital literacy conceptions, is absent
from many frameworks of digital literacy (Bawden, 2008; Eshet-Alkalai, 2004; Ng, 2012). Instead, frameworks of digital literacy define
the range of skills or competencies that students need to master in order to be able to accomplish task goals. Nevertheless, the
prevalence of goal-directed conceptions of digital literacy across both the U.S. and Swedish samples suggests the need for existing
digital literacy frameworks to better incorporate the purpose or goals of students’ skill use into their definitions of digital literacy.
Across the four profiles identified, pre-service teachers' conceptions of digital literacy were dominated by a focus on the fluent use of technology. Of course, this technological focus is likely reflective of participants' experiences growing up in a digital society (Conole, De Laat, Dillon, & Darby, 2008; Kennedy, Judd, Dalgarno, & Waycott, 2010; Thompson, 2013). Nevertheless, there are at least two further explanations for the prominence of technology use in pre-service teachers' understandings of digital literacy. The first, rather self-evident, explanation concerns the difference between asking participants to define digital vis-à-vis "traditional" literacy. When thinking about "traditional" literacy, reading-related behaviors are central; when asked to define digital literacy, by contrast, pre-service teachers may focus on what makes this type of literacy distinct from its more traditional counterpart. For pre-service teachers, this distinction between digital and traditional literacy may, reasonably, be the savvy and fluent use of technology. However, participants' conceptions of technology use, within the context of digital literacy, were found to be largely superficial in nature and largely lacking in the recognition of the Internet as a unique, epistemic environment (Tsai, 2004).
Another explanation for the technology focus when defining digital literacy is the concrete nature of technology. The fluent use of
technology is easier to recognize and assess than is engagement in critical reasoning. In other words, whereas making a video clip, as a technological facet of digital literacy, can be demonstrated with ease, it is more challenging to determine whether the critical evaluation of information has taken place. Consequently, the concrete or behavioral nature of technology use seems to move it to the forefront of many pre-service teachers' definitions. Although technology use, doubtlessly, is an important component of digital literacy, both policy-based (e.g., OECD, EU) and research-based definitions of digital literacy emphasize that skills like critical thinking, source evaluation, and integration are needed as well (OECD, 2015; Visser, 2013). Indeed, these definitions suggest that a sophisticated conception of digital literacy is one that is marked not only by the fluent, but also by the critical, use of technology. As such, digital literacy may be expected to represent not only a skillset but also a collection of dispositions or critical attitudes toward information.
In addition to identifying four profiles among pre-service teachers’ conceptions of digital literacy, we further argue that these
profiles can be conceptualized as distinct from one another and as progressing in sophistication. In particular, we may expect pre-
service teachers becoming more sophisticated in their conceptions of digital literacy to move from focusing on technology use as
necessary for digital literacy to recognizing that the critical use of technology is essential as well. Indeed, as may be expected of digital
literacy conceptions progressing in sophistication, comparatively fewer pre-service teachers in our sample understood digital literacy
as requiring the critical use of technology (14% of total sample), as compared to those defining digital literacy only as technology
focused (38% of total sample). This was the case across both the United States and Sweden. This theorized progression in sophistication
adds further nuance to the development of professional competence related to digital literacy introduced by Krumsvik (2008; 2011;
2014). In particular, Krumsvik (2008), in defining teachers' professional digital competence, introduces a Model of Digital Competence for Teachers and Teacher Educators. Within this framework, Krumsvik suggests that as teachers grow in their professional competence, they enhance their basic ICT skills and didactic ICT competence, come to understand their strategies for technological learning,
and develop digital bildung (i.e., a meta-awareness of the connection between their own digital competence and societal digitaliza
tion). Like Krumsvik (2008; 2011), we adopt a progressive or development-focused view on pre-service teachers’ digital literacy and
specifically address how pre-service teachers’ conceptions of what these include may develop through the course of their education.
Moreover, the four profiles identified can be understood as componential in nature. In other words, pre-service teachers whose
open-ended responses were placed into the critical use category, considered the most sophisticated of the four definitions of digital
literacy, were expected to conceptualize digital literacy in terms of its less sophisticated components (e.g., technology use or digital
reading). This was exemplified in responses such as: “Digital literacy would be the process of finding and understanding digital
literature to obtain factual knowledge or evidence about a topic. To be digitally literate requires proper search skills and an under
standing of acceptable sources.” In this response is reflected the simultaneous recognition that digital literacy involves reading that is
digital (i.e., digital literature), goal-directed (i.e., process of finding … to obtain factual knowledge), and critically-evaluative (i.e., an
understanding of acceptable sources) in nature. This is further consistent with an understanding of digital literacy as a set of interdependent or inter-related competencies (Eshet-Alkalai, 2004, 2012; Ng, 2012), rather than as discrete skills able to be identified and isolated.
Beyond placing pre-service teachers into the four digital literacy profiles identified, we were also interested in the relative prevalence of pre-service teachers' profile membership across the two national settings (i.e., the United States and Sweden).
One difference to emerge across national settings was the presence of responses in the Swedish sample suggesting that digital literacy required knowledge or skills in programming. Such definitions were largely absent from the U.S. sample.
Another difference to emerge across national settings was the large number of participants placed into the “other” category, when
defining digital literacy, in the Swedish sample. For the most part, pre-service teachers placed into the “other” category were non-
responsive (e.g., writing “I don’t know” or “Knowledge, know how to teach”). We attributed this difference in response quality to
differences in the manner in which surveys were administered. Given that Swedish participants completed the surveys independently
and online, a lower degree of response quality may be understandable.
This study was carried out to inform the developing literature on what teachers in the digital age need to know for digital
competence (Krumsvik, 2008; 2014). While prior work has identified the digital and pedagogical competencies that pre-service
teachers need (Güneş & Bahçivan, 2018; Krumsvik, 2008; 2014) and examined undergraduates’ attitudes toward and beliefs about
digital literacy (Garcia-Martin & Garcia-Sanchez, 2017; Marangunić & Granić, 2015), we contribute to the literature by asking a more fundamental question: how do pre-service teachers conceptualize the competence in digital literacy that they are expected to develop?
In doing so, we identify four distinct and discernible profiles of digital literacy conceptions. While aspects of these profiles have been
considered in frameworks of digital literacy (Eshet-Alkalai, 2012; Ng, 2012), this study is unique in establishing various skills as
defining the conceptions of digital literacy that pre-service teachers hold. These conceptions both echo trends with regard to technology use and critical evaluation identified in prior work and correspond to a newly proposed trajectory for pre-service teachers'
development of conceptions of digital competence. This framework is able to inform both pre-service teacher preparation and
assessment in the area of digital literacy and, potentially, teachers’ own conceptions of activities related to digital literacy, as they
occur in their classrooms. Moreover, we both validate these four profiles of conceptions of digital literacy across diverse national
settings (i.e., in the United States and Sweden) and suggest potential pathways whereby unique national educational policy settings
may be associated with pre-service teachers’ conceptions, as reflected in the emphasis on critical evaluation found among Swedish
participants. As such, this study is unique in connecting individual pre-service teachers’ psychological conceptions of digital literacy
with broader national and educational trends.
4.6. Limitations
Despite the strengths of this study, a number of limitations must be acknowledged. First, in this study our aim was to extend prior
work investigating pre-service teachers’ understandings of digital literacy (Burnett, 2011; Martinovic & Zhang, 2012) by explicitly
asking them to define and describe this construct. This methodological approach stands in contrast to survey and other self-report
measures that have tried to get at pre-service teachers’ conceptions of digital literacy indirectly, for instance by asking them to
self-report the ICT-activities that they use in the classroom (Hargittai, 2009; Prior, Mazanov, Meacheam, Heaslip, & Hanson, 2016).
Nevertheless, our methodological approach was limited in that it did not comprehensively tap all of pre-service teachers’ conceptions
of digital literacy or how these may be employed in classroom settings. Moreover, a methodology that asked participants to select those skills that they considered to be essential for digital literacy, rather than rating the list of skills as a whole, offered both benefits and limitations. On the one hand, this allowed us to distill only those skills, from extensive taxonomies found in prior work (see van Laar, van Deursen, van Dijk, & de Haan, 2017 for a review), that pre-service teachers considered to be essential for digital literacy, and required pre-service teachers to make judgments regarding skills' relative importance, as they are often required to do in classroom settings. On the other hand, asking teachers to evaluate or rate all skills represents an important direction for future work and one that would allow for further scale development and validation.
Second, this study was limited by focusing only on one particular group: pre-service teachers, early in their educator preparation
programs. Even though this group was expected to be familiar with digital literacy, we may have received an incomplete picture of pre-
service teachers’ understandings of digital literacy, as we chose to focus on a group of students early in their teacher preparation
programs, typically not yet receiving much formal instruction in digital literacy and how it may be developed in students. Further
research should investigate changes in conceptions of digital literacy among pre-service teachers throughout their degree programs as
well as following their practicum placements in the classroom. More generally, replicating this study with a larger sample, representing
other majors and areas of study as well as national, cultural, and educational settings remains an important area for future work.
Finally, examining pre-service teachers’ conceptions of digital literacy with nationally representative samples remains an important
step for future work.
Third, a number of methodological differences emerged across administrations of the digital literacy survey in Sweden and the
United States. These differences included participants in Sweden being asked to select the six skills they consider to be essential for
digital literacy, rather than the five skills that participants were asked to select in the U.S., and the digital literacy items presented to
the Swedish sample not being randomized. Although we tried to address these issues in our analyses in various ways (e.g., using
proportions of those selecting particular digital literacy items, rather than percentages), these discrepancies may, nevertheless, have
biased the comparisons that we are able to make across settings and represent a substantial limitation. Moreover, there were a large
number of Swedish participants whose open-ended responses were placed into the “other” category. This pattern of categorization is
understandable given that Swedish participants were completing the study voluntarily, with no compensation, and at a time and
location of their choosing. Nevertheless, the volume of participants coded into the other category (19.00%) may be responsible for the
non-significant association between open-ended categories of digital literacy conceptions and cluster membership found in the
Swedish sample. This lack of an association may further have been exacerbated by the extremely low representation of Swedish
participants’ open-ended responses placed into the digital reading category (1.65%).
Finally, the cluster analysis undertaken in this study reflected a number of decisions on the part of the researchers. Limitations in sample size and conceptual difficulties precluded our including all 24 digital literacy skills in a single cluster analysis; we therefore considered it preferable to select a subset of items for inclusion. We selected those items that corresponded to the four digital literacy profiles identified on the basis of students’ open-ended responses. Nevertheless, the clusters identified, and their interpretation, are likely to have changed had a different subset of items been selected. For instance, we elected to exclude the item “Searching for and selecting relevant information,” despite, or rather because of, the large percentage of participants in both Sweden and the United States who selected it. To the extent that the goal of cluster analysis is to find meaningful yet distinctive patterns in the data, an item endorsed nearly universally contributes little between-cluster variation, making its inclusion unappealing from a statistical standpoint. As a result of this methodological decision and others, we consider further developing and validating objective measures of
pre-service teachers’ digital literacy to be an important avenue for future work. For instance, Garcia-Martin and Garcia-Sanchez (2017)
examined pre-service teachers’ use and perceptions of various technology platforms (e.g., Twitter, Google Docs) across four years;
similar cross-sectional and longitudinal work could be used to examine the development of pre-service teachers’ conceptions of digital literacy, once valid and reliable measures are developed.
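To make the clustering procedure concrete, the general approach of grouping participants by their binary skill selections can be sketched as follows. This is an illustrative reconstruction only, not the study's actual analysis: the item names, the simulated responses, and the use of k-means are all stand-in assumptions, and the statistical point about near-universally endorsed items is demonstrated on fabricated data.

```python
# Illustrative sketch of clustering participants' binary skill selections into
# digital literacy profiles. Item names, data, and the choice of k-means are
# hypothetical stand-ins, NOT the study's actual items or procedure.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(seed=42)

# A subset of items assumed to map onto the four conception profiles
items = ["operate_technology", "read_digital_text",
         "accomplish_goals", "critically_evaluate"]

# Simulated 0/1 matrix: each row is one participant's selections
selections = rng.integers(0, 2, size=(300, len(items)))

# An item endorsed by nearly everyone (like "searching for and selecting
# relevant information") has near-zero variance, so it cannot help separate
# clusters -- the statistical rationale given for excluding it.
near_universal = (rng.random(300) < 0.95).astype(int)
print("variance of near-universal item:", round(near_universal.var(), 3))

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(selections)

# Each centroid gives the proportion of a cluster's members selecting each
# item, which is how a profile would be interpreted and labeled.
for k, center in enumerate(kmeans.cluster_centers_):
    print(f"cluster {k}:", dict(zip(items, np.round(center, 2))))
```

The centroid printout mirrors how cluster solutions are typically interpreted in survey research: a cluster in which, say, only the operational items approach 1.0 would correspond to a technology-focused profile.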
4.7. Implications
Based on pre-service teachers’ answers to open-ended and selected-response questions, we propose a framework for understanding
different conceptions of digital literacy. We see pre-service teachers’ conceptions of digital literacy as falling along a continuum of
sophistication, from those who consider digital literacy to only be technology focused, to those who consider digital literacy to require
critically reflective technology use. This framework for categorizing conceptions of digital literacy is presented in Fig. 1.
As displayed in Fig. 1, each of the four categories of digital literacy conceptions is componential in nature, such that the critical use category includes a greater volume of knowledge and skills, more sophisticated in nature, than other categories (e.g., technology focused). This greater sophistication is associated with a more complex view of digital literacy, considering more components
of digital literacy and their inter-relation. Further, Fig. 1 demonstrates how a more sophisticated conception of digital literacy may
develop. Of course, students do not develop such dispositions automatically or only through repeated exposure to technology. Rather, purposeful and quality instruction is needed. In academic settings, we view students as moving from using technology only occasionally, when externally prompted to do so, to habitually, based on their own volition. Ultimately, students may internalize the use of
technology as a fundamental mechanism for learning and interacting with the world. This internalization of technology use may lead
students to develop the dispositions and habits of mind necessary for critical evaluative technology use.
Teachers, then, are tasked not only with critically evaluating technology for their own personal use but also for use in their
classrooms. Standards for pre-service teacher preparation, when describing critically evaluative technology use, ought to include both
the ability to evaluate sources and information quality and the skills needed to critically appraise apps, software, and social media for
classroom use. This may include understanding how apps are gamified to prolong engagement or how social media is used as a means of collecting personal data and metadata (Selwyn & Pangrazio, 2018). Future teachers should know how to critically assess software or
applications used in their teaching and understand how their choice of technology impacts both their students’ academic experiences
in the classroom and their technological lives, online (e.g., what student information may be public or collected by corporations).
5. Conclusion
There exists a gap between how policy organizations (e.g., OECD, PISA, EU, and the American Library Association, 2018) and
pre-service teachers define digital literacy. Definitions introduced in policy documents reflect all four levels of digital literacy conceptions included in our proposed framework, while (too) many participants, from both Sweden and the United States, were found to
hold a more superficial understanding of digital literacy, focusing only on its technological aspects. This gap needs to be addressed by
educational institutions on all levels; however, addressing this gap within pre-service teacher preparation programs may serve to
translate more complex conceptions of digital literacy to K-12 classrooms. If pre-service teachers come to hold more sophisticated
conceptions of digital literacy, they may implement more sophisticated digital literacy activities into their instruction and engage their
students in critical reasoning about technology to a greater extent.
Author contributions
The first (Alexandra List) and second (Eva Brante) authors were responsible for paper conceptualization, data collection and
analysis, and manuscript composition and revision. The third author, Holly Klee, was responsible for data analysis, manuscript
conceptualization, composition, and revision.
References
Albion, P. R. (2001). Some factors in the development of self-efficacy beliefs for computer use among teacher education students. Journal of Technology and Teacher Education, 9(3), 321–347.
Alexander, P. A., & The Disciplined Reading and Learning Research Laboratory. (2012). Reading into the future: Competence for the 21st century. Educational Psychologist, 47(4), 259–280.
American Library Association. (2018). Digital literacy. Retrieved from https://literacy.ala.org/digital-literacy/.
Andersson, P., & Sofkova-Hashemi, S. (2016). Screen-based literacy practices in Swedish primary schools. Nordic Journal of Digital Literacy, 11(2), 86–103.
Bates, R., & Khasawneh, S. (2007). Self-efficacy and college students’ perceptions and use of online learning systems. Computers in Human Behavior, 23(1), 175–191.
https://doi.org/10.1016/j.chb.2004.04.004.
Bawden, D. (2008). Origins and concepts of digital literacy. In C. Lankshear, & M. Knobel (Eds.), Digital literacies: Concepts, policies, and practices (2nd ed., pp. 17–32).
New York: Peter Lang Publishing Inc.
Bergdahl, N., Fors, U., Hernwall, P., & Knutsson, O. (2018). The use of learning technologies and student engagement in learning activities. Nordic Journal of Digital Literacy, 13(2), 113–130. https://doi.org/10.18261/1891-943X-2018-02-04.
Braasch, J. L., Bråten, I., Strømsø, H. I., Anmarkrud, Ø., & Ferguson, L. E. (2013). Promoting secondary school students’ evaluation of source features of multiple
documents. Contemporary Educational Psychology, 38(3), 180–195. https://doi.org/10.1016/j.cedpsych.2013.03.003.
Brand-Gruwel, S., Wopereis, I., & Walraven, A. (2009). A descriptive model of information problem solving while using internet. Computers & Education, 53(4),
1207–1217. https://doi.org/10.1016/j.compedu.2009.06.004.
Brante, E. W., & Stang Lund, E. (2017). Undervisning i en sammansatt textvärld: En intervjustudie med svenska och norska gymnasielärare om kritisk läsning och kritisk värdering av källinformation [Teaching in a complex world of texts: An interview study with Swedish and Norwegian upper secondary teachers about critical reading and critical evaluation of source information]. Nordic Journal of Literacy Research, 3, 1–18.
Britt, M. A., & Aglinskas, C. (2002). Improving students’ ability to identify and use source information. Cognition and Instruction, 20(4), 485–522. https://doi.org/
10.1207/S1532690XCI2004_2.
Britt, M. A., Rouet, J. F., Blaum, D., & Millis, K. (2019). A reasoned approach to dealing with fake news. Policy Insights from the Behavioral and Brain Sciences, 6(1),
94–101. https://doi.org/10.1177/2372732218814855.
Brown, G. J. (2001). Beyond print: Reading digitally. Library Hi Tech, 19(4), 390–399. https://doi.org/10.1108/07378830110412456.
Buckingham, D. (2010). The future of media literacy in the digital age: Same challenges for policy and practice. Media Education Journal, 47, 3–10. http://discovery.
ucl.ac.uk/id/eprint/10006339.
Burnett, C. (2011). Pre-service teachers’ digital literacy practices: Exploring contingency in identity and digital literacy in and out of educational contexts. Language and Education, 25(5), 433–449. https://doi.org/10.1080/09500782.2011.584347.
Cai, J., & Gut, D. (2018). Literacy and digital problem-solving skills in the 21st century: What PIAAC says about educators in the United States, Canada, Finland, and
Japan. Teaching Education, 1–32. https://doi.org/10.1080/10476210.2018.1516747.
Chen, H. Y. (2011). Online reading comprehension strategies among fifth- and sixth-grade general and special education students. Education Research and Perspectives, 37(2), 79–109. https://search.informit.com.au/documentSummary;dn=201112968;res=IELAPA.
Chetty, K., Qigui, L., Gcora, N., Josie, J., Wenwei, L., & Fang, C. (2018). Bridging the digital divide: Measuring digital literacy. Economics: The Open-Access, Open-Assessment E-Journal, 12(2018–23), 1–20. http://hdl.handle.net/10419/177899.
Coiro, J. (2003). Exploring literacy on the internet: Reading comprehension on the internet: Expanding our understanding of reading comprehension to encompass
new literacies. The Reading Teacher, 56(5), 458–464. https://www.jstor.org/stable/20205224.
Conole, G., De Laat, M., Dillon, T., & Darby, J. (2008). Disruptive technologies, pedagogical innovation: what’s new? Findings from an in-depth study of students’ use
and perception of technology. Computers & Education, 50(2), 511–524. https://doi.org/10.1016/j.compedu.2007.09.009.
Council for the Accreditation of Educator Preparation. (2018). CAEP 2018 K-6 elementary teacher preparation standards. Washington, D.C. Retrieved from: http://caepnet.org/~/media/Files/caep/standards/2018-caep-k-6-elementary-teacher-prepara.pdf?la=en.
Council of the European Union. (2018). Council Recommendation of 22 May 2018 on key competences for lifelong learning (text with EEA relevance). Official Journal of the European Union, C 189, 4 June 2018. Retrieved from: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=OJ:C:2018:189:TOC.
Creswell, J. W., & Plano Clark, V. L. (2011). Designing and conducting mixed methods research (2nd ed.). Thousand Oaks, CA: Sage Publications.
Davis, T. J., South, J. B., & Stevens, K. D. (2018). Information and communication technology and educational policies in the United States of America and Canada. In
Second Handbook of information technology in primary and secondary education (pp. 1–22). Springer.
Ennis, R. H. (1989). Critical thinking and subject specificity: Clarification and needed research. Educational Researcher, 18(3), 4–10. https://doi.org/10.3102/
0013189X018003004.
Ertmer, P. A. (2005). Teacher pedagogical beliefs: The final frontier in our quest for technology integration? Educational Technology Research and Development, 53(4),
25–39. https://doi.org/10.1007/BF02504683.
Ertmer, P. A., Ottenbreit-Leftwich, A., Sadik, O., Sendurur, E., & Sendurur, P. (2012). Teacher beliefs and technology integration practices: A critical relationship.
Computers & Education, 59(2), 423–435. https://doi.org/10.1016/j.compedu.2012.02.001.
Eshet-Alkalai, Y. (2004). Digital literacy: A conceptual framework for survival skills in the digital era. Journal of Educational Multimedia and Hypermedia, 13(1),
93–106. https://www.learntechlib.org/primary/p/4793/.
Eshet-Alkalai, Y. (2012). Thinking in the digital era: A revised model for digital literacy. Issues in Informing Science and Information Technology, 9(2), 267–276. http://
iisit.org/Vol9/IISITv9p267-276Eshet021.pdf.
Eshet-Alkali, Y., & Amichai-Hamburger, Y. (2004). Experiments in digital literacy. CyberPsychology and Behavior, 7(4), 421–429. https://doi.org/10.1089/
cpb.2004.7.421.
European Commission. (2017). Digital competence framework for citizens (DigComp 2.1). Publications Office of the European Union. Retrieved from http://publications.
jrc.ec.europa.eu/repository/bitstream/JRC83167/lb-na-26035-enn.pdf.
Fives, H., & Buehl, M. M. (2012). Spring cleaning for the “messy” construct of teachers’ beliefs: What are they? Which have been examined? What can they tell us?. In
K. R. Harris, S. Graham, & T. Urdan (Eds.), APA educational psychology handbook: Individual differences and cultural and contextual factors (Vol. 2, pp. 471–499)
Washington, DC: American Psychological Association.
Fraillon, J., Ainley, J., Schulz, W., & Friedman, T. (2014). Preparing for life in a digital age: The IEA international computer and information literacy study international
report. Springer. Retrieved from https://link.springer.com/content/pdf/10.1007/978-3-319-14222-7.pdf.
Francke, H., & Sundin, O. (2012). Negotiating the role of sources: Educators’ conceptions of credibility in participatory media. Library & Information Science Research,
34(3), 169–175. https://doi.org/10.1016/j.lisr.2011.12.004.
Garcia-Martin, J., & Garcia-Sanchez, J. N. (2017). Pre-service teachers’ perceptions of the competence dimensions of digital literacy and of psychological and
educational measures. Computers & Education, 107, 54–67. https://doi.org/10.1016/j.compedu.2016.12.010.
Güneş, E., & Bahçivan, E. (2018). A mixed research-based model for pre-service science teachers’ digital literacy: Responses to “which beliefs” and “how and why they
interact” questions. Computers & Education, 118, 96–106. https://doi.org/10.1016/j.compedu.2017.11.012.
Hargittai, E. (2009). An update on survey measures of web-oriented digital literacy. Social Science Computer Review, 27(1), 130–137. https://doi.org/10.1177/
0894439308318213.
Hatlevik, O. E., Guðmundsdóttir, G. B., & Loi, M. (2015). Digital diversity among upper secondary students: A multilevel analysis of the relationship between cultural capital, self-efficacy, strategic use of information and digital competence. Computers & Education, 81, 345–353. https://doi.org/10.1016/j.compedu.2014.10.019.
Hatlevik, O. E., Throndsen, I., Loi, M., & Gudmundsdottir, G. B. (2018). Students’ ICT self-efficacy and computer and information literacy: Determinants and
relationships. Computers & Education, 118, 107–119. https://doi.org/10.1016/j.compedu.2017.11.011.
Hobbs, R., & Coiro, J. (2019). Design features of a professional development program in digital literacy. Journal of Adolescent & Adult Literacy, 62(4), 401–409.
https://doi.org/10.1002/jaal.907.
Inan, F. A., & Lowther, D. L. (2010). Factors affecting technology integration in K-12 classrooms: A path model. Educational Technology Research and Development, 58
(2), 137–154. https://doi.org/10.1007/s11423-009-9132-y.
Joo, Y. J., Park, S., & Lim, E. (2018). Factors influencing preservice teachers’ intention to use technology: TPACK, teacher self-efficacy, and technology acceptance
model. Journal of Educational Technology & Society, 21(3), 48–59. https://www.jstor.org/stable/10.2307/26458506.
Kennedy, G., Judd, T., Dalgarno, B., & Waycott, J. (2010). Beyond natives and immigrants: Exploring types of net generation students. Journal of Computer Assisted
Learning, 26(5), 332–343. https://doi.org/10.1111/j.1365-2729.2010.00371.x.
Koltay, T. (2011). The media and the literacies: Media literacy, information literacy, digital literacy. Media, Culture & Society, 33(2), 211–221. https://doi.org/
10.1177/0163443710393382.
Krumsvik, R. J. (2008). Situated learning and teachers’ digital competence. Education and Information Technologies, 13(4), 279–290. https://doi.org/10.1007/s10639-
008-9069-5.
Krumsvik, R. J. (2011). Digital competence in the Norwegian teacher education and schools. Högre Utbildning, 1(1), 39–51.
Krumsvik, R. J. (2014). Teacher educators’ digital competence. Scandinavian Journal of Educational Research, 58(3), 269–280.
Lai, P. C. (2017). The literature review of technology adoption models and theories for the novelty technology. JISTEM-Journal of Information Systems and Technology
Management, 14(1), 21–38. https://doi.org/10.4301/s1807-17752017000100002.
Lee, C., Yeung, A. S., & Cheung, K. W. (2019). Learner perceptions versus technology usage: A study of adolescent English learners in Hong Kong secondary schools.
Computers & Education, 133, 13–26. https://doi.org/10.1016/j.compedu.2019.01.005.
List, A. (2019). Defining digital literacy development: An examination of pre-service teachers’ beliefs. Computers & Education, 138, 146–158.
List, A., Alexander, P. A., & Stephens, L. A. (2017). Trust but verify: Examining the association between students’ sourcing behaviors and ratings of text
trustworthiness. Discourse Processes, 54(2), 83–104.
List, A., Peterson, E. G., Alexander, P. A., & Loyens, S. M. (2018). The role of educational context in beliefs about knowledge, information, and truth: an exploratory
study. European Journal of Psychology of Education, 33(4), 685–705.
Lu, J., Yu, C. S., Liu, C., & Yao, J. E. (2003). Technology acceptance model for wireless Internet. Internet Research, 13(3), 206–222. https://doi.org/10.1108/
10662240310478222.
Ma, W. W. K., Andersson, R., & Streith, K. O. (2005). Examining user acceptance of computer technology: An empirical study of student teachers. Journal of Computer
Assisted Learning, 21(6), 387–395. https://doi.org/10.1111/j.1365-2729.2005.00145.x.
Maggioni, L., Riconscente, M. M., & Alexander, P. A. (2006). Perceptions of knowledge and beliefs among undergraduate students in Italy and in the United States.
Learning and Instruction, 16(5), 467–491. https://doi.org/10.1016/j.learninstruc.2006.09.006.
Mangen, A., Walgermo, B. R., & Brønnick, K. (2013). Reading linear texts on paper versus computer screen: Effects on reading comprehension. International Journal of
Educational Research, 58, 61–68. https://doi.org/10.1016/j.ijer.2012.12.002.
Marangunić, N., & Granić, A. (2015). Technology acceptance model: A literature review from 1986 to 2013. Universal Access in the Information Society, 14(1), 81–95.
https://doi.org/10.1007/s10209-014-0348-1.
Margaryan, A., Littlejohn, A., & Vojt, G. (2011). Are digital natives a myth or reality? University students’ use of digital technologies. Computers & Education, 56(2),
429–440. https://doi.org/10.1016/j.compedu.2010.09.004.
Martinovic, D., & Zhang, Z. (2012). Situating ICT in the teacher education program: Overcoming challenges, fulfilling expectations. Teaching and Teacher Education: An
International Journal of Research and Studies, 28(3), 461–469. https://doi.org/10.1016/j.tate.2011.12.001.
Mills, S. (2006). Using the internet for active teaching and learning! New Jersey: Pearson Publishing.
Muis, K. R., & Sinatra, G. M. (2008). University cultures and epistemic beliefs: Examining differences between two academic environments. In M. S. Khine (Ed.),
Knowing, knowledge and beliefs (pp. 137–150). Dordrecht: Springer. https://doi.org/10.1007/978-1-4020-6596-5_6.
Nasah, A., DaCosta, B., Kinsell, C., & Seok, S. (2010). The digital literacy debate: An investigation of digital propensity and information and communication
technology. Educational Technology Research and Development, 58(5), 531–555. https://doi.org/10.1007/s11423-010-9151-8.
Ng, W. (2012). Can we teach digital natives digital literacy? Computers & Education, 59(3), 1065–1078. https://doi.org/10.1016/j.compedu.2012.04.016.
Noh, Y. (2019). A comparative study of public libraries’ contribution to digital inclusion in Korea and the United States. Journal of Librarianship and Information
Science, 51(1), 59–77. https://doi.org/10.1177/0961000616668571.
Norušis, M. J. (2011). IBM SPSS statistics 19 guide to data analysis. Upper Saddle River, New Jersey: Prentice Hall.
Nygren, T., & Guath, M. (2019). Swedish teenagers’ difficulties and abilities to determine digital news credibility. Nordicom Review, 40(1), 23–42.
OECD. (2015). Students, computers, and learning: Making the connection. PISA: OECD Publishing. https://doi.org/10.1787/9789264239555-en. Retrieved from.
Pajares, M. F. (1992). Teachers’ beliefs and educational research: Cleaning up a messy construct. Review of Educational Research, 62(3), 307–332. https://doi.org/
10.3102/00346543062003307.
Peterson, E., & Alexander, P. A. (2020). Navigating Print and Digital Sources: Students’ Selection, Use, and Integration of Multiple Sources Across Mediums. The
Journal of Experimental Education, 88(1), 27–46.
Porat, E., Blau, I., & Barak, A. (2018). Measuring digital literacies: Junior high-school students’ perceived competencies versus actual performance. Computers &
Education, 126, 23–36. https://doi.org/10.1016/j.compedu.2018.06.030.
Porter, A., McMaken, J., Hwang, J., & Yang, R. (2011). Common core standards: The new US intended curriculum. Educational Researcher, 40(3), 103–116. https://
doi.org/10.3102/0013189X11405038.
Prior, D. D., Mazanov, J., Meacheam, D., Heaslip, G., & Hanson, J. (2016). Attitude, digital literacy and self efficacy: Flow-on effects for online learning behavior. The
Internet and Higher Education, 29, 91–97. https://doi.org/10.1016/j.iheduc.2016.01.001.
Rouet, J. F., & Britt, M. A. (2011). Relevance processes in multiple document comprehension. In M. T. McCrudden, J. P. Magliano, & G. Schraw (Eds.), Text relevance
and learning from text (pp. 19–52). Charlotte, NC: Information Age Publishing, Inc.
Salmerón, L., Strømsø, H. I., Kammerer, Y., Stadtler, M., & van den Broek, P. (2018). Comprehension processes in digital reading. In Learning to read in a digital world (pp. 91–120). John Benjamins Publishing Company.
Sarstedt, M., & Mooi, E. (2014). Cluster analysis. A concise guide to market research. Berlin, Heidelberg: Springer.
Scherer, R., Tondeur, J., Siddiq, F., & Baran, E. (2018). The importance of attitudes toward technology for pre-service teachers’ technological, pedagogical, and
content knowledge: Comparing structural equation modeling approaches. Computers in Human Behavior, 80, 67–80. https://doi.org/10.1016/j.chb.2017.11.003.
Schmid, R., & Petko, D. (2019). Does the use of educational technology in personalized learning environments correlate with self-reported digital skills and beliefs of
secondary-school students? Computers & Education, 136, 75–86. https://doi.org/10.1016/j.compedu.2019.03.006.
Schmidt, W. H., Wang, H. C., & McKnight, C. C. (2005). Curriculum coherence: An examination of US mathematics and science content standards from an
international perspective. Journal of Curriculum Studies, 37(5), 525–559. https://doi.org/10.1080/0022027042000294682.
Selwyn, N., & Pangrazio, L. (2018). Doing data differently? Developing personal data tactics and strategies amongst young mobile media users. Big Data & Society, 5
(1). https://doi.org/10.1177/2053951718765021.
SFS. (2018). Higher education ordinance (1993:100). Retrieved from: http://www.riksdagen.se/sv/dokument-lagar/dokument/svensk-forfattningssamling/
hogskoleforordning-1993100_sfs-1993-100.
Simard, S., & Karsenti, T. (2016). A quantitative and qualitative inquiry into future teachers’ use of information and communications technology to develop students’
information literacy skills. Canadian Journal of Learning and Technology, 42(5), 1–23.
Singer, L. M., & Alexander, P. A. (2017a). Reading across mediums: Effects of reading digital and print texts on comprehension and calibration. The Journal of
Experimental Education, 85(1), 155–172. https://doi.org/10.1080/00220973.2016.1143794.
Singer, L. M., & Alexander, P. A. (2017b). Reading on paper and digitally: What the past decades of empirical research reveal. Review of Educational Research, 87(6),
1007–1041. https://doi.org/10.3102/0034654317722961.
Skolverket [National Agency for Education]. (2018). Curriculum for the compulsory school, preschool class and the recreation centre. Revised 2017. Retrieved from: https://www.skolverket.se/sitevision/proxy/publikationer/svid12_5dfee44715d35a5cdfa2899/55935574/wtpub/ws/skolbok/wpubext/trycksak/Blob/pdf3975.pdf?k=3975.
Sofkova Hashemi, S., & Cederlund, K. (2017). Making room for the transformation of literacy instruction in the digital classroom. Journal of Early Childhood Literacy, 17(2), 221–253. https://doi.org/10.1177/1468798416630779.
Spante, M., Sofkova Hashemi, S., Lundin, M., & Algers, A. (2018). Digital competence and digital literacy in higher education research: Systematic review of concept
use. Cogent Education. https://doi.org/10.1080/2331186X.2018.1519143.
Stordy, P. (2015). Taxonomy of literacies. Journal of Documentation, 71(3), 456–476. https://doi.org/10.1108/JD-10-2013-0128.
Stuart, C., & Thurlow, D. (2000). Making it their own: Preservice teachers’ experiences, beliefs, and classroom practices. Journal of Teacher Education, 51(2), 113–121.
https://doi.org/10.1177/002248710005100205.
Sundin, O. (2015). Invisible search: Information literacy in the Swedish curriculum for compulsory schools. Nordic Journal of Digital Literacy, 10(4), 193–209.
Teo, T., Lee, C. B., & Chai, C. S. (2008). Understanding pre-service teachers’ computer attitudes: Applying and extending the technology acceptance model. Journal of Computer Assisted Learning, 24(2), 128–143. https://doi.org/10.1111/j.1365-2729.2007.00247.x.
Thompson, P. (2013). The digital natives as learners: technology use patterns and approaches to learning. Computers & Education, 65, 12–33. https://doi.org/10.1016/
j.compedu.2012.12.022.
Trakhman, L. M. S., Alexander, P. A., & Silverman, A. B. (2018). Profiling reading in print and digital mediums. Learning and Instruction, 57, 5–17. https://doi.org/
10.1016/j.learninstruc.2018.04.001.
Tsai, C. C. (2004). Beyond cognitive and metacognitive tools: The use of the Internet as an ‘epistemological’ tool for instruction. British Journal of Educational
Technology, 35(5), 525–536. https://doi.org/10.1111/j.0007-1013.2004.00411.x.
Van Dijk, J. A. G. M. (2009). One Europe, digitally divided. In A. Chadwick, & P. N. Howard (Eds.), Routledge Handbook of Internet politics (pp. 288–305). New York,
NY: Routledge.
Visser, M. (2013). Digital literacy and public policy through the library lens. Maine Policy Review, 22(1), 104–113. https://digitalcommons.library.umaine.edu/mpr/
vol22/iss1/27.
Wagner, T. (2008). Even our “best” schools are failing to prepare students for 21st-century careers and citizenship. Educational Leadership, 66(2), 20–24.