
Empirical Research

Journal of Mixed Methods Research, 2025, Vol. 0(0) 1–21
© The Author(s) 2025
Article reuse guidelines: sagepub.com/journals-permissions
DOI: 10.1177/15586898241313245
journals.sagepub.com/home/mmr

Measuring the Degree of Mixed Methods Adoption: An Investigation Using Doctoral Dissertation Abstracts

Sinem Toraman Turk (1), Kyle Cox (2), and Vicki L. Plano Clark (3)

(1) Department of Health Policy and Management, Yale School of Public Health, and Yale Global Health Leadership Initiative, New Haven, CT, USA
(2) Cato College of Education, University of North Carolina at Charlotte, Charlotte, NC, USA
(3) School of Education, University of Cincinnati, Cincinnati, OH, USA

Corresponding Author: Sinem Toraman Turk, Department of Health Policy and Management, Yale School of Public Health, and Yale Global Health Leadership Initiative, 60 College Street, New Haven, CT 06520, USA. Email: [email protected]

Abstract
Prevalence studies have been conducted to assess the adoption of mixed methods research
(MMR) across contexts. A limitation of these studies for understanding MMR adoption is their
reliance on dichotomous operationalization or focus on one MMR practice. This study employed
an item response theory model to examine the measurement of MMR adoption with reported
MMR elements as indicators. Results indicate that MMR studies can be differentiated on their
degree of MMR adoption and that elements such as Integration techniques, Sequence, and
Qualitative and Quantitative Methods have a pronounced ability to differentiate MMR studies on
the degree they adopted MMR practices. This study contributes to the field of MMR by offering a
novel approach to operationalizing the degree of MMR adoption.

Keywords
adoption of mixed methods research, doctoral dissertation abstracts, item response theory,
mixed methods research practices, mixed methods as a field of research

The field of mixed methods research encompasses a body of knowledge and community of
scholars dedicated to advancing understanding and application of research that intentionally
combines quantitative and qualitative approaches (Plano Clark & Ivankova, 2016). Many scholars
in the field have focused on methodological issues, including the foundations, procedures, and
validity considerations associated with mixing methods (e.g., Hesse-Biber & Johnson, 2015;
Tashakkori & Teddlie, 2003, 2010a). Alongside this work, scholars have argued about social and
communal facets of research practices and their influence on the adoption and use of mixed
methods research (Denscombe, 2008; Morgan, 2007). Scholars have examined historical and
social contexts that shape how mixed methods research has been adopted and used by researchers
and disciplines over time (e.g., Alise & Teddlie, 2010; Bazeley, 2015; Ivankova & Kawamura,
2010; Johnson & Gray, 2010; Maxwell, 2016). Questions about the adoption and use of mixed
methods are important for the field. Answers to these questions map out the current field of mixed
methods research, which shapes future practices, identifies gaps in methodological understanding
and procedures, and highlights areas of interest (Creswell, 2009; Mertens et al., 2016; Tashakkori
& Teddlie, 2010b).
The most common approach for studying the adoption and use of mixed methods has been
prevalence studies that assess the frequency of mixed methods elements as indications of mixed
methods practice within specific contexts (Molina-Azorin & Fetters, 2016). When studying
adoption of mixed methods research, researchers have described the frequency of mixed methods
elements as reporting practices within single disciplines (e.g., Coates, 2021; Fàbregues et al.,
2022; Granikov et al., 2020; O’Cathain et al., 2008; Pluye & Hong, 2014; Younas et al., 2019) and
funding agencies (e.g., Coyle et al., 2018; Plano Clark, 2010) and made comparisons among
multiple disciplines (e.g., Alise & Teddlie, 2010; Ivankova & Kawamura, 2010; Ross &
Onwuegbuzie, 2010). These prevalence studies typically define a sample of interest, deter-
mine what percentage of the sample used mixed methods, and describe some aspects of that use,
such as details of a mixed methods element (e.g., type of mixed methods design and type of
integration technique). Collectively, prevalence studies provide a panoramic picture of the fre-
quency that mixed methods elements have been adopted and used within specific contexts.
More recently, Morgan (2022) examined the adoption of mixed methods research by designing
a measure to assess the extent of adoption and comparing empirical articles published in the
Journal of Mixed Methods Research, representing the core of the field, with those in other journals from
the Social Science Citation Index (SSCI), representing work outside the core of the field. Morgan (2022)
focused on one mixed methods practice (i.e., integration of results/inferences) and operationalized
the measure in two ways: (a) Coding for the extent of integration in the results ranging from zero to
four (Levels 0, 1, 2, 3, and 4) and (b) examining the number of paragraphs devoted to integration
as an indication of mixed methods adoption. Although Morgan (2022) expanded the measurement
of mixed methods adoption beyond dichotomies (presence or absence), he described his approach
as “exploratory” (p. 4) and noted that more detailed approaches to measuring adoption of mixed
methods practices were likely needed.
Despite the contributions made by mixed methods prevalence and adoption studies, there are
still limitations to these approaches. First, although mixed methods prevalence rates are typically
well defined in these studies, the link between prevalence rates and the concept of mixed methods
adoption is generally not well articulated. For example, some researchers studying mixed methods
adoption through the prevalence of mixed methods elements (e.g., Coyle et al., 2018; Plano Clark,
2010) have used the term adoption as defined by Rogers (2003). Here, adoption is conceptualized
as a complex process where individuals gain knowledge about, try out, and implement new
behaviors and approaches. However, to what extent and how the prevalence of mixed methods
elements indicates the adoption of mixed methods research is unclear in prevalence studies.
Second, most of the mixed methods prevalence studies are limited by their dichotomous
operationalization of mixed methods adoption. Although operationalizing adoption of mixed
methods research as present or absent (i.e., dichotomously) is a form of measurement, it is a
measurement model that is overly simplified and makes several questionable assumptions. For
example, this approach assumes mixed methods research is either completely adopted or not and
that every study that adopts mixed methods does so equally. These assumptions fail to consider
variation in the degree to which mixed methods elements are adopted. Potentially more
concerning, a dichotomous approach assumes every indicator of mixed methods adoption is
equivalent, which does not align with the strong emphasis made by mixed methods researchers on
integration as the core of mixed methods research (e.g., Bazeley, 2016; Bryman, 2006; Fetters &
Freshwater, 2015; Morgan, 2022; O’Cathain et al., 2007; Tashakkori et al., 2021). In addition,
typically, a simple standard of having adopted mixed methods or not (e.g., self-identified as mixed
methods research or “mixed methods” as a key word) is applied to studies that are examined in a
prevalence study.
Third, historically the adoption and use of mixed methods research was assessed through a
checklist with criteria, and the term level of acceptance (i.e., minimal, moderate, or major) of
mixed methods research was used within different disciplines (Creswell & Plano Clark, 2007).
Then, Vicki Plano Clark used the term adoption for describing the acceptance and use of mixed
methods research in two prevalence studies (Coyle et al., 2018; Plano Clark, 2010). However,
measures for the degree of mixed methods adoption are limited to Morgan’s (2022) exploratory
approach. Morgan (2022) demonstrated promise and value in being able to measure the extent of
adoption and use the measure to make comparisons; however, it is limited to only one mixed
methods element (i.e., integration of results/inferences). Identifying the nuances of multiple mixed
methods elements can reveal different relationships between individual mixed methods elements
and mixed methods adoption. This expanded understanding could provide a more comprehensive
and holistic picture of mixed methods adoption.
Therefore, the purpose of this study was to examine the measurement of mixed methods
adoption with reported mixed methods elements as indicators. The overarching research question
of this study was: In what ways are specific elements of mixed methods practice related to the
degree of mixed methods adoption? We employed an item response theory model to assess the
relationship between reported mixed methods elements and mixed methods adoption. We
conducted a secondary analysis of a data set generated in a previous prevalence study of the mixed
methods practices reported in abstracts of master’s thesis and doctoral dissertations (Toraman
et al., 2020).
This current work contributes to the field of mixed methods research by using a psychometric
approach to identify nuances in the relationship between individual mixed method elements and
mixed methods adoption. In addition, this process introduces a novel approach to operationalizing
the degree of mixed methods adoption and critically reflects on its utility and limitations.
Collectively, these contributions provide a more comprehensive and holistic picture of mixed
methods adoption. Furthermore, clearly defining degree of mixed methods adoption may help
scholars read and review mixed methods research studies by identifying focal elements, allow
comparisons as Morgan (2022) attempted to do, and track changes in the use of mixed methods
over time. Moreover, researchers interested in using mixed methods research may learn about the
elements that matter for demonstrating mixed methods adoption. With this study, our goal is to
ignite dialogue in the field of mixed methods research and encourage novel and critical thought
among researchers regarding the operationalization of mixed methods adoption in prevalence
studies. Specifically, we hope to motivate the field to move beyond dichotomous treatment of
mixed methods adoption (i.e., presence and absence).

Mixed Methods Adoption as a Construct


Three major assumptions form the foundation for this study’s operationalization of mixed methods
adoption as a construct, and these assumptions are explained within three subheadings: (a)
adoption of mixed methods research as an innovation, (b) connection between adoption of mixed
methods research and mixed methods prevalence, and (c) measuring the degree of mixed methods
adoption. In light of these assumptions, we then explain the use of doctoral dissertation abstracts
for this analysis.

Adoption of Mixed Methods Research as an Innovation


Despite the long history of mixing different research methods (Maxwell, 2016), mixed methods
research has evolved as a new way of thinking with its methodological conventions in innovatively
addressing complex problems through the integration of qualitative and quantitative approaches
(Bazeley, 2018; Tashakkori et al., 2021). Therefore, mixed methods research is conceived as an
innovation in this study. This assumption is supported by two main points. First, previous studies
conceived mixed methods research as an innovation when examining mixed methods adoption
through the prevalence of mixed methods elements (i.e., Coyle et al., 2018; Plano Clark, 2010). As
defined by the diffusion of innovations theory, an innovation can be “an idea, practice, or object
that is perceived as new by an individual or other unit of adoption” (Rogers, 2003, p. 12). An
innovation may diffuse in a social system at different rates and in different ways that can be examined (Rogers,
2003). In the context of mixed methods research, social systems may consist of researchers’
surroundings, such as institutions, disciplines, and available mixed methods research guidelines,
because “the process of acquiring knowledge is social—it is learned through participation within
the group and through the adoption of shared practices” (Denscombe, 2008, p. 276). Second,
adoption of mixed methods research as an innovation can also be explained with Fujimura’s
(1988, 1996) concept of a bandwagon effect as discussed by Morgan (2022). Building upon
Fujimura’s (1988, 1996) description of cancer research formation starting as a bandwagon,
Morgan (2022) explains the exponential growth in the use of mixed methods research over the last two decades, traced through the key term "mixed methods" in SSCI, and describes how the conceptualization of mixed methods research has attracted researchers and developed as a new field of study across different contexts since the late 1990s.

Connection Between Adoption of Mixed Methods Research and Mixed Methods Prevalence

Adoption of mixed methods research by individuals and disciplines has been of interest to the field
to better understand the extent to which and how researchers use mixed methods in their research
practices (e.g., Coyle et al., 2018; Creswell & Plano Clark, 2018; Morgan, 2022; Plano Clark,
2010). Researchers who conduct mixed methods prevalence studies identify specific mixed
methods elements of interest discussed in the literature to examine in their analysis of research
documents (Howell Smith & Shanahan Bazis, 2021). Although there are varying perspectives
about which mixed methods elements should be examined in mixed methods prevalence studies,
we focused on 14 key mixed methods elements described in the mixed methods literature
(i.e., major sources used by graduate students and available to inform the abstracts being ex-
amined) and that collectively reflect mixed methods practice (i.e., Creswell & Plano Clark, 2018;
Coyle et al., 2018; Greene, 2007; Molina-Azorin & Fetters, 2016). We considered these 14 mixed
methods elements as possible indicators of mixed methods adoption. A list and description of each
element is provided in Table 1.

Table 1. Definitions of 14 Key Elements Found in the Literature and Considered as Possible Indicators of Mixed Methods Adoption.

Mixed methods term in the title: Using the term "mixed method" in the title (a, b)
Mixed methods rationale: Explaining why mixed methods is used (c, b)
Mixed methods design: Stating the type of mixed methods design used (a, c, b)
Sequence (timing): Explicitly stating timing (e.g., simultaneous and sequential) of when quantitative and qualitative components are implemented (c, b)
Priority: Stating weighting or importance of qualitative and quantitative components (e.g., qualitatively driven and equivalent status) (c, b)
Qualitative methods: Stating the type(s) of qualitative methods (sampling, data collection, analysis) used (a, c)
Quantitative methods: Stating the type(s) of quantitative methods (sampling, data collection, analysis) used (a, c)
Qualitative results: Providing an overview of qualitative results (c)
Quantitative results: Providing an overview of quantitative results (c)
Integration techniques: Stating how qualitative and quantitative data were integrated (a, c, b)
Mixed methods integrated results: Providing an overview of mixed methods integrated results (c, b)
Paradigmatic assumption for mixed methods research: Identifying paradigms that embrace the value of mixing data collection and analytical strategies (c, d)
Added value of using mixed methods research: Stating how mixed methods added value to what was learned (c, d, b)
Advancing the field of mixed methods research: Stating the contribution of the study to the field of mixed methods research (c, b)

Note. a = Creswell and Plano Clark (2018); b = Molina-Azorin and Fetters (2016); c = Coyle et al. (2018); d = Greene (2007).

Adoption is described as "a decision to make full use of an innovation as the best course of action available" (Rogers, 2003, p. 177), and adopting an innovation involves researchers gaining knowledge of the innovation, forming an attitude towards it, and putting it into practice. We assume that researchers who have adopted mixed methods research will include mixed methods elements when they apply this approach. From this standpoint, reported mixed methods elements indicate the degree to which mixed methods
research has been adopted. We postulate that the degree of mixed methods adoption varies from
limited adoption of mixed methods research to full adoption of mixed methods research based on
reported elements (see Figure 1). Placement on this adoption continuum can then be determined
through the presence of mixed methods elements. When examining the presence or absence of
these elements during investigations of mixed methods practice, researchers typically treat these
elements as equally important. This is an underexamined assumption that is not theoretically
supported by literature. In the context of this study, we assume that the inclusion of different mixed
methods elements represents differences in practice and each element (and subsequent practice)
may have varied importance when adopting mixed methods research.

Measuring the Degree of Mixed Methods Adoption


Conceptualizing the degree of mixed methods adoption as a construct measured by mixed
methods research elements allows the examination of (a) individual relationships between each
element and degree of mixed methods adoption and (b) variation among studies in the degree they
adopt mixed methods research. To examine the degree of mixed methods adoption, we used the
latent variable theory to operationalize mixed methods adoption as a latent construct. Under the
latent variable theory, variables can be classified into two categories based on their observability.
Manifest variables such as physical characteristics (e.g., height or weight) are directly observable
(de Ayala, 2009). In contrast, latent variables are not directly observable or measurable and thus
require the use of indicators to estimate their true value (de Ayala, 2009). Latent variables include cognitive (e.g., memory or executive functioning) and noncognitive traits (e.g., motivation or self-efficacy), all of which are measured using some type of indicator (e.g., observed outcomes, item responses, or performance) (de Ayala, 2009). Latent variable models refer to a group of mathematical models that attempt to explain the relationship between latent variables (i.e., latent traits or constructs) and their corresponding indicators (de Ayala, 2009). Latent variable models often assume that the underlying latent construct is organized in an unobservable continuum that can be assessed using a measurement model.

Figure 1. Operationalization of the degree of mixed methods adoption. Note. MMR = Mixed methods research. Asterisk (*) = MMR prevalence studies in the literature have focused on one or more of these elements. In this study, we used all 14 elements that are described in the literature (e.g., Creswell & Plano Clark, 2018; Coyle et al., 2018; Greene, 2007; Molina-Azorin & Fetters, 2016).
In the context of this study, we conceptualized mixed methods adoption as a manifest variable
and the degree of mixed methods adoption as a latent construct with reported elements of mixed
methods practice, serving as observable indicators. We considered 14 elements of mixed methods
practice described in the literature (see Table 1 and Figure 1) and examined if and how each
indicated the degree of mixed methods adoption.

Why Use Doctoral Dissertation Abstracts?


For this analysis, we applied our new operationalization of the degree of mixed methods adoption
to doctoral dissertation abstracts. Doctoral dissertations, and by extension their abstracts, represent
the culminating work of graduate students and the next generation of scholarly research. The
doctoral dissertation abstracts included in this analysis have been previously reviewed and scored
as part of a prevalence study (Toraman et al., 2020), making the dataset available for secondary
analysis. Using doctoral dissertation abstracts for this investigation has value because graduate
students, who conducted mixed methods dissertations, are likely to have the most up to date mixed
methods research training (McKim, 2017). In addition, graduate students are the ones who will
implement and develop mixed methods in the future (Mertens et al., 2016).
Doctoral dissertation abstracts also provide several practical advantages for this innovative
analysis. First, doctoral dissertation abstracts present the adoption of mixed methods elements
within a manageable length (≈ 350 words) that is more informative than a typical journal abstract
(≈ 120–150 words). Second, doctoral dissertation abstracts represent doctoral students’ adoption
of abstract schemes varying across disciplines (Toraman et al., 2020). This variation across
disciplines allows a broader scope of mixed methods adoption to be considered. Third, doctoral
dissertation abstracts provide an accessible sample pool that is large enough for empirical in-
vestigations using advanced measurement approaches such as item response theory. Taken all
together, doctoral dissertation abstracts are a valuable and practical source of information about
mixed methods research and methodologically suitable to examine a new operationalization of the
degree of mixed methods adoption, using reported practices as observable indicators of an
underlying latent construct.

Methods
To address our research question, we conducted a secondary analysis of an existing dataset of
doctoral dissertation abstract scores. Specifically, we applied and compared two analytic ap-
proaches (i.e., simple descriptive statistics and an item response theory model) to assess the ways
specific elements of mixed methods practice relate to the degree of mixed methods adoption. In the
following sections, we provide an overview of the dataset for secondary analysis and describe our
detailed analyses and results for the two approaches.

Dataset for Secondary Analysis


The existing dataset of abstract scores for this study is a subset of a previous study that examined
the prevalence of mixed methods research in the abstracts of master’s theses and doctoral dis-
sertations that were published between 2013 and 2018 and labeled as mixed methods research by
graduate students in the ProQuest Dissertations and Theses GlobalTM database (Toraman et al.,
2020). The current secondary analysis was conducted on the sample of 800 doctoral dissertation
abstracts identified in the previous study, which spanned a range of fields and disciplines (more than 40 in total). The presence of each of the 14 mixed methods elements described in Table 1 was scored
(yes = 1/no = 0) for each abstract. Three researchers conducted scoring to ensure valid and reliable
scores, and the process has been described in detail elsewhere (Toraman et al., 2020).

Descriptive Statistics Methods and Results


We began the investigation of the degree of mixed methods adoption by considering the inclusion rate
of the 14 mixed methods elements. As noted, we treat these elements identified in the original
prevalence study (Toraman et al., 2020) as possible indicators of mixed methods adoption. We refer to
these elements as items to reflect typical practices from the measurement literature. As depicted in
Figure 2, the inclusion rate of items across doctoral dissertation abstracts varied greatly across items
from 85% to <1%. Noteworthy results include the extremely low inclusion rate of the Paradigmatic
assumption for mixed methods (0.8%) and Advancing the field of mixed methods research (0.5%) items
(see Figure 2). These items were included in just six and four doctoral dissertation abstracts, re-
spectively. Conversely, no items had extremely high inclusion rates (i.e., >99%). Qualitative methods
(85.1%) was the most common item included in the doctoral dissertation abstracts closely followed by
Quantitative methods (85.0%). The variation in items included within the doctoral dissertation ab-
stracts suggests varying degrees of mixed methods adoption.
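As a concrete illustration of the scoring structure behind these rates, the R sketch below builds a toy 0/1 matrix of the same shape as the scored dataset and computes per-item inclusion rates. The column names and simulated values are illustrative assumptions, not the authors' data or code.

```r
# Toy stand-in for the scored dataset: 800 abstracts x 14 binary element indicators.
set.seed(2025)
elements <- c("mm_title", "mm_rationale", "mm_design", "sequence", "priority",
              "qual_methods", "quant_methods", "qual_results", "quant_results",
              "integration_tech", "mm_integrated_results", "paradigm",
              "added_value", "advancing_field")
abstracts <- as.data.frame(matrix(rbinom(800 * 14, 1, 0.4), nrow = 800,
                                  dimnames = list(NULL, elements)))

round(colMeans(abstracts), 3)  # per-item inclusion rates (cf. Figure 2)
```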
A simplistic measure of degree of mixed methods adoption is a sum or total score, which is the
total number of mixed methods items included in each doctoral dissertation abstract. We cal-
culated sum scores for each dissertation abstract by assigning one (1) point if the mixed methods
research element was included and zero (0) point if the item was missing or incomplete. The
distribution of sum scores in Figure 3 provides an illustration of variation in sum scores. Scores
ranged from a minimum of 0 to a maximum of 14 and followed an approximately normal
distribution (M = 5.6, SD = 2.29). These sum scores provide a rudimentary representation of the
degree of mixed methods adoption. While simple to understand, sum scores make the unwarranted
assumption that each item is equally important or provides the same amount of information about
the degree of mixed methods adoption. For example, inclusion of integration techniques carries
the same weight (i.e., one point) as including some form of mixed methods in the title of the
doctoral dissertation. This equivalence assumption, however, does not align with the strong
emphasis made by mixed methods researchers on integration as the core of mixed methods
research (e.g., Bazeley, 2016; Bryman, 2006; Creswell & Plano Clark, 2018; Fetters &
Freshwater, 2015; Morgan, 2022; O’Cathain et al., 2007; Tashakkori et al., 2021). Therefore,
it is important to describe the relationships among these items and empirically investigate their
relationship to mixed methods adoption.

Figure 2. Inclusion rate and frequency of 14 key mixed methods research elements in doctoral dissertation abstracts (N = 800) according to descriptive statistics.

Figure 3. Frequency of sum scores based on 14 key mixed methods research items (N = 800) according to descriptive statistics.
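A sum score is simply a row total over those 0/1 indicators. The sketch below continues the toy data frame from the earlier sketch, so the numbers are illustrative rather than the distribution reported in Figure 3.

```r
sum_scores <- rowSums(abstracts)            # number of elements reported per abstract
summary(sum_scores)
barplot(table(factor(sum_scores, levels = 0:14)),
        xlab = "Sum score", ylab = "Number of abstracts")  # cf. Figure 3
```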
Given the dichotomous scale of the items (i.e., present = 1/absent = 0), we found the tetrachoric
correlations between all items (see Supplemental Materials Table 1). We also examined a covariance
matrix including all items (see Supplemental Materials Table 2). Two critical results for subsequent
item response theory analysis emerged from these initial descriptive statistics. First, the extremely
small inclusion rate of Paradigmatic assumption for mixed methods (0.8%) and Advancing the field of
mixed methods research (0.5%) led to near zero variance and subsequent near zero covariances
between these two low frequency items and all other items. Although Paradigmatic assumption for
mixed methods and Advancing the field of mixed methods research may have theoretical merits for
inclusion, they do not function well in this empirical analysis. To avoid undermining the item response
theory analysis, both items were excluded from consideration because they are not suitable indicators for
measuring the degree of mixed methods adoption in an item response theory model. We further
explored the 10 rare cases that included Paradigmatic assumption for mixed methods and Advancing
the field of mixed methods research items (see Table 2) and noted that the presence of some mixed methods items varied across the 10 cases that included these two items. For example, Priority was absent in all 10 cases, and Added value of using mixed methods research was found in only one of the 10.

Table 2. Presence of Other Mixed Methods Items for the 10 Rare Cases Including Paradigmatic Assumption for Mixed Methods and Advancing the Field of Mixed Methods Research Items.

Paradigmatic assumption for mixed methods (n = 6): Mixed methods design (n = 6), Qualitative methods (n = 5), Quantitative methods (n = 5), Mixed methods term in the title (n = 4), Sequence (timing) (n = 4), Qualitative results (n = 4), Mixed methods integrated results (n = 4), Integration techniques (n = 3), Quantitative results (n = 3), Mixed methods rationale (n = 2), Priority (n = 0), Added value of using mixed methods research (n = 0).

Advancing the field of mixed methods research (n = 4): Mixed methods design (n = 4), Qualitative methods (n = 4), Mixed methods term in the title (n = 3), Quantitative methods (n = 3), Sequence (timing) (n = 3), Integration techniques (n = 3), Qualitative results (n = 3), Mixed methods integrated results (n = 3), Mixed methods rationale (n = 2), Quantitative results (n = 2), Added value of using mixed methods research (n = 1), Paradigmatic assumption for mixed methods (n = 0), Priority (n = 0).
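A sketch of this screening step is below, continuing the toy data frame from the earlier sketches; the psych package is an assumption, as the article does not name the software used for these descriptive checks.

```r
library(psych)

# Drop the two near-zero-variance items before correlating.
keep <- setdiff(names(abstracts), c("paradigm", "advancing_field"))
tet  <- tetrachoric(abstracts[, keep])

round(tet$rho, 2)           # inspect for redundant, near-1 item pairs
min(eigen(tet$rho)$values)  # a value <= 0 signals a non-positive-definite matrix
```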
The second critical result from the initial descriptive statistics involved highly correlated items.
We found the Quantitative and Qualitative methods item pair and Quantitative and Qualitative
results item pair so highly correlated (r = 0.86 and r = .77, respectively) that they caused a non-positive-definite correlation matrix. A non-positive-definite correlation matrix prevents the subsequent use of item response theory approaches. From a conceptual perspective, a dissertation
abstract that included the Quantitative methods item almost always included the Qualitative
methods item. Essentially these two elements function as a single indicator. We found remarkably
similar results for the Quantitative results and Qualitative results items. To address this issue, we
combined these item pairs into single items. The combination of Quantitative and Qualitative
methods was transformed into the QQ Methods item, and the combination of Quantitative and
Qualitative results was transformed into the QQ Results item. If an abstract was missing one or
both items from the item pair, the new item (i.e., QQ Methods or QQ Results) was coded as zero
(0). Conversely, if the abstract included both methods items or both results items, QQ Methods and
QQ Results were coded as one (1). These adjustments to our mixed methods indicators produced a
final list of 10 items suitable for our analysis (see Table 3). Specifically, after these adjustments we
can consider item specific relationships with mixed methods adoption using an item response
theory model.
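Continuing the same illustrative sketch, the two redundant pairs can be collapsed into single indicators that equal 1 only when both members of the pair are present, leaving the final 10-item set:

```r
qq_methods <- as.integer(abstracts$qual_methods & abstracts$quant_methods)
qq_results <- as.integer(abstracts$qual_results & abstracts$quant_results)

items10 <- data.frame(
  abstracts[, c("mm_title", "mm_rationale", "mm_design", "sequence", "priority",
                "integration_tech", "mm_integrated_results", "added_value")],
  qq_methods, qq_results)
```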

Item Response Theory Analysis Methods and Results


Using item response theory allows us to investigate specific relationships between the 10 mixed
methods research items and degree of mixed methods adoption. Item response theory measurement models focus on the responses to individual items (i.e., mixed methods elements) and each item's relationship to the underlying construct (i.e., degree of mixed methods adoption). Using R (R Development Core Team, 2018), we examined the difficulty of item inclusion and the
ability of items to differentiate cases being measured (i.e., doctoral dissertation abstracts) on the
construct of interest (i.e., degree of mixed methods adoption). The results from the item response
theory analysis indicate the association between mixed methods items and the degree to which
mixed methods research is adopted. More specifically, results provided information about the
differences between individual items and an item’s contribution to the measurement of mixed
methods adoption.

Table 3. Descriptive Statistics and Correlations for Final 10 Indicators of Mixed Methods Adoption.

Item (N = 800 for each)               % Inclusion    1      2      3      4      5      6      7      8      9
1. Mixed methods design                  83.9        -
2. QQ methods                            80.5       0.28    -
3. QQ results                            53.4       0.20   0.48    -
4. Sequence (timing)                     46.4       0.40   0.40   0.24    -
5. Mixed methods term in the title       38.6       0.37   0.18   0.21   0.26    -
6. Mixed methods integrated results      31.5       0.27   0.39   0.33   0.33   0.27    -
7. Integration techniques                30.0       0.25   0.48   0.20   0.63   0.28   0.57    -
8. Mixed methods rationale               24.6       0.47   0.04   0.01   0.09   0.08   0.14   0.25    -
9. Added value                            5.9       0.25   0.13   0.13   0.03   0.21   0.15   0.28   0.25    -
10. Priority                              3.3       0.19   0.15   0.05   0.51   0.09   0.04   0.40   0.22   0.09

Note. QQ = Qualitative and Quantitative.

Item Response Theory Concepts


Before detailing the item response theory model considered in our analysis, a brief conceptual
overview is beneficial for understanding the process and results. The core of item response theory
rests on the probability of including an item and its connection to an underlying related construct
(de Ayala, 2009). Theoretically, more of this construct increases the probability of inclusion while
decreasing amounts of the construct reduce the probability of inclusion. In our study, we expect as
the degree of mixed methods adoption increases, the likelihood of including more mixed methods
items also increases. Conversely, the likelihood of mixed methods items being present is lower
when mixed method studies have adopted mixed methods research in a more limited fashion.
Relatedly, more difficult items require more of the underlying construct to be consistently included,
while easier items require smaller amounts of the underlying construct to achieve consistent
inclusion. In our study, we expected mixed methods items that require deeper and more com-
prehensive understanding of mixed methods practices to only be included in those studies that
have fully adopted mixed methods research. Conversely, mixed methods items that require little
underlying knowledge of mixed methods practices are likely to be present in studies that have
adopted mixed methods to a very limited degree.
A typical assumption of many item response theory models is unidimensionality or the as-
sumption that a single construct is being measured (de Ayala, 2009). Item response theory models
have been extended to accommodate multiple constructs or traits. These multidimensional item
response theory models can include multiple constructs (i.e., traits or factors) to explain item
response but are more complex and require appropriate theoretical support. In the context of this
study, we treat mixed methods adoption as a unidimensional latent construct. In the background
section (see Mixed Methods Adoption as a Construct), we presented the theoretical evidence to
support this assumption. Treating adoption of mixed methods as a unidimensional latent construct
allows us empirically to investigate several aspects of the relationship between mixed methods
adoption and individual mixed methods research items. Specifically, item response theory models
typically involve three key parameters: (a) item difficulty, (b) item discrimination, and (c) a
guessing parameter (de Ayala, 2009).
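For concreteness, the two-parameter logistic (2pL) model that ties these parameters together can be written as follows; this is the standard textbook formulation (e.g., de Ayala, 2009), not an equation reproduced from the original article:

\[
P(x_{ij} = 1 \mid \theta_j) = \frac{1}{1 + \exp\left[-a_i\,(\theta_j - b_i)\right]}
\]

Here \(\theta_j\) is the degree of mixed methods adoption for abstract \(j\), \(b_i\) is the difficulty of item \(i\) (the value of \(\theta\) at which the probability of inclusion is 50%), and \(a_i\) is the item's discrimination (how steeply that probability changes around \(b_i\)). The Rasch and 1pL models discussed below constrain the discriminations to be equal across items, whereas the 2pL lets them vary.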
A difficulty parameter represents the level of the underlying construct at which there would be a
50% chance of inclusion or endorsement of an item. In our context, the difficulty parameter
represents the degree of mixed methods adoption at which there would be a 50% chance of
inclusion in the dissertation abstract. Essentially, the difficulty parameter describes how hard it is
to include each mixed methods research item in the dissertation abstract. The discrimination
parameter represents the capacity of an item to differentiate cases with various levels of the latent
construct. In our context, discrimination values indicate how well each mixed methods research
item can differentiate doctoral dissertation abstracts by degree of mixed methods adoption. This
discrimination parameter may take on positive and negative values with larger values suggesting
more capability and smaller values indicating less capability to differentiate abstracts. In this
capacity, the discrimination parameter captures the strength of the relationship between the mixed
methods research item and latent construct (i.e., the degree of mixed methods adoption). Which of
these parameters are estimated, and whether they are restricted or free to vary across items, are common
distinguishing features of different item response theory models. The guessing parameter is
applicable in testing and multiple-choice settings, so no further discussion is needed.

Item Response Theory Model


We considered several progressively less restrictive item response theory models: the Rasch
model, the one-parameter logistic model (1pL), and the two-parameter logistic model (2pL) along
with a multidimensional item response theory model (de Ayala, 2009). Using a combination of
criteria including information criterion indices, likelihood ratio tests, model parsimony, and
interpretability, we found the 2pL model to be the most appropriate (see Table 4). First, the 2pL
model had significantly better model fit over the Rasch model based on information criterion and a
likelihood ratio test. Second, we considered a multidimensional 2pL with a second factor
consisting of the Mixed methods rationale, Mixed methods design, and Added value of mixed
methods items. This multidimensional item response theory model only provided minor im-
provements to model fit indices, but the additional factor and related items had no discernable
pattern. In other words, utilization of a multidimensional item response theory model produced
minor improvements in model performance, but the factors identified under mixed methods
adoption were not clearly defined because the associated mixed methods items seem unrelated.
The selected 2pL model reflects the unidimensional adoption of mixed methods construct
theorized in literature, which is best for interpretability. The 2pL model also provides a variety of
pertinent information about the relationship between mixed methods items and mixed methods
adoption. Specifically, a 2pL model allows both the difficulty and discrimination parameter to
vary. Although model fit indices for the 2pL model are not strong, they do meet or approach
minimal thresholds (see Table 4). The totality of theoretical and empirical evidence justifies using
the 2pL model, but we discuss poor model performance and its relation to the complexity of
capturing mixed methods adoption in the limitations section.

Table 4. Comparison of Different IRT Models' Performance With Mixed Methods Research Items.

Model                    logLik      AIC       BIC      RMSEA   SRMSR   TLI    CFI
Rasch                   -3910.24   7842.47   7894.00    0.07    0.08   0.77   0.78
2pL                     -3861.58   7763.16   7856.85    0.06    0.05   0.85   0.89
Multidimensional 2pL    -3850.12   7742.23   7840.61    0.05    0.04   0.89   0.91

Note. logLik = Log-likelihood; AIC = Akaike Information Criterion; BIC = Bayesian Information Criterion; RMSEA = Root Mean Square Error of Approximation; SRMSR = Standardized Root Mean Squared Residual; TLI = Tucker–Lewis Index; CFI = Comparative Fit Index.
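As a rough sketch of this model comparison, one possible R workflow is shown below, continuing the toy items10 data from the earlier sketches; the article does not name the estimation package, so the use of mirt here is an assumption.

```r
library(mirt)

fit_rasch <- mirt(items10, model = 1, itemtype = "Rasch", verbose = FALSE)
fit_2pl   <- mirt(items10, model = 1, itemtype = "2PL",   verbose = FALSE)

anova(fit_rasch, fit_2pl)                             # AIC, BIC, likelihood ratio test
M2(fit_2pl)                                           # RMSEA, SRMSR, TLI, CFI (cf. Table 4)
coef(fit_2pl, IRTpars = TRUE, simplify = TRUE)$items  # difficulty (b) and discrimination (a)
```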

Item Response Theory Results


Results from our 2pL model are presented in Table 5. These results show that mixed methods
items varied in difficulty to include and the capacity to indicate the degree of mixed methods
adoption (i.e., discrimination).

Table 5. Difficulty and Discrimination Parameter Results From Item Response Theory for the Items Measuring the Degree of Mixed Methods Adoption Construct (N = 800).

Item (2pL estimates)                  Difficulty (SE)    Discrimination (SE)
Mixed methods term in the title         0.74 (0.15)        0.69 (0.11)
Mixed methods rationale                 2.62 (0.61)        0.45 (0.11)
Mixed methods design                   -1.91 (0.24)        1.03 (0.16)
Sequence                                0.13 (0.06)        1.62 (0.20)
Priority                                3.86 (0.85)        0.99 (0.27)
Integration techniques                  0.61 (0.06)        2.84 (0.49)
Mixed methods integrated results        0.77 (0.09)        1.35 (0.17)
Added value of using mixed methods      4.77 (1.35)        0.62 (0.19)
QQ methods                             -1.37 (0.14)        1.38 (0.20)
QQ results                             -0.21 (0.12)        0.71 (0.11)

Note. QQ = Qualitative and Quantitative; SE = Standard error.
Difficulty parameter results showed that Added value of using mixed methods (Difficulty
value = 4.77) and Priority (Difficulty value = 3.86) were more difficult items, whereas Mixed
methods design (Difficulty value = -1.91) and QQ Methods (Difficulty value = -1.37) were
easier, meaning likely to be present in an abstract even with limited mixed methods adoption. As
depicted in Figure 4(a), we plotted a curve of the item characteristics and used Mixed methods
design (solid line) and Priority (dashed line) items to illustrate items with low and high difficulty,
respectively. The Mixed methods design item crosses the 50% probability of inclusion line at
approximately -1.91, whereas the more difficult Priority item crosses the 50% probability line
around 3.86. The gap between these items represents the difference in adoption of mixed methods
research associated with inclusion of the item. In other words, even abstracts that indicated a
limited degree of mixed methods adoption commonly included Mixed methods design, whereas
only doctoral dissertation abstracts representing a full adoption of mixed methods research in-
cluded a mention of Priority. The results suggest that abstracts representing fully or almost fully
adopted mixed methods research would almost certainly include Mixed methods design and QQ
Methods elements (less difficult items) and are likely but not certain to include items such as
Added value of using mixed methods and Priority elements (more difficult items).

Figure 4. Plot of item difficulty example for Mixed methods design and Priority items (a) and discrimination example for Integration technique and Mixed methods title items (b) based on item response theory results.
The discrimination parameter estimates are provided in Table 5. Results from the 2pL model
suggested the Integration techniques item (Discrimination value = 2.84) had the greatest ca-
pability to differentiate doctoral dissertation abstracts on the degree they adopted mixed methods
research. This result suggests that the Integration techniques item has a notable ability to identify
doctoral dissertation abstracts that have more fully adopted mixed methods research. This is a
highly valuable result as it provides strong empirical evidence to support integration of methods as
a core component of mixed methods research methodology (e.g., Bazeley, 2016; Fetters et al.,
2013). Other elements with large discrimination values were also related to integration or the dual
methods utilized in mixed methods research including Sequence (Discrimination value = 1.62),
Mixed methods integrated results (Discrimination value = 1.35), and QQ Methods (Discrimination
value = 1.38). Including a rationale for mixed methods (i.e., Mixed methods rationale item) had
the lowest discrimination value (0.45) suggesting that it is less capable of differentiating abstracts
with varying degrees of mixed methods adoption.
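To make the contrast concrete, the short calculation below plugs the Table 5 estimates into the standard 2pL formula; it is a worked illustration, not output from the original analysis.

```r
# Probability of including an item under the 2PL at two adoption levels.
p2pl <- function(theta, a, b) 1 / (1 + exp(-a * (theta - b)))

p2pl(c(0, 1), a = 2.84, b = 0.61)  # Integration techniques: about 0.15 then 0.75
p2pl(c(0, 1), a = 0.69, b = 0.74)  # Mixed methods term in the title: about 0.37 then 0.54
```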
As depicted in Figure 4(b), we plotted a curve of the item characteristics and use Integration
technique (dashed line) and Mixed methods title (solid line) items to illustrate items with high and
low discrimination, respectively. Discrimination is illustrated in the slope of the item curves. It is
noteworthy that the Integration technique curve has a steep slope, whereas the curve for Mixed
methods title is much more gradual. This slope represents the probability of including the mixed
methods research practice. The degree of mixed methods adoption needs to increase a substantial
amount to see a large change in the probability of including Mixed methods title, whereas the
nearly vertical slope of Integration technique indicates that the inclusion of Integration technique
is very unlikely when the degree of mixed methods adoption is less than average (theta = 0) but it is
very likely to be present when the degree of adoption is at or above average. This ability to
distinguish or discriminate between degrees of mixed methods adoption is an important trait in a
mixed methods item. For example, the presence of Integration technique can serve as a signal to
identify studies (or doctoral dissertations) that have more fully adopted mixed methods research
paving the way for more nuanced considerations in mixed methods research prevalence studies.
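Curves like those in Figure 4 can be drawn directly from a fitted model; the sketch below assumes the mirt fit and illustrative item names used in the earlier sketches.

```r
itemplot(fit_2pl, item = "integration_tech")  # steep curve: high discrimination
itemplot(fit_2pl, item = "mm_title")          # gradual curve: low discrimination
plot(fit_2pl, type = "trace")                 # all item characteristic curves together
```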

Discussion
The purpose of this study was to examine the measurement of mixed methods adoption with
reported mixed methods elements as indicators. We employed an item response theory model to
assess the relationship between reported mixed methods elements (i.e., items) and mixed methods
adoption. Descriptive statistics indicated variation in item inclusion rates and sum scores giving
credence to an operationalization of mixed methods adoption that varies by degree on a continuum
of limited adoption to full adoption. Subsequent item response theory analysis included 10 key
mixed methods elements as items and focused on individual relationships between each element
and degree of mixed methods adoption as well as variation among studies in the degree they adopt
mixed methods research. A 2pL model indicated Integration techniques, Sequence, and QQ
Methods items had a notable ability (i.e., high discrimination values) to identify doctoral dis-
sertation abstracts likely to have adopted mixed methods research to a full extent. Priority and
Added value of using mixed methods were difficult items that are only likely to be present when
studies have adopted mixed methods research to a full extent. These results suggest that prev-
alence studies that treat adoption of mixed methods dichotomously have some limitations in
explaining the adoption of mixed methods research. In addition, our results showed that mixed
methods elements each have varying degrees of value in explaining mixed methods adoption and
should therefore not be treated as equally important.
As previously explained, we assumed that the inclusion of mixed methods elements as research
practices was related to the degree of mixed methods adoption construct. The results of this study
confirm this assumption with two nuances. First, instead of 14 elements of mixed methods
practice, the mixed methods adoption construct was examined using 10 items. Qualitative and
quantitative methods and qualitative and quantitative results items were found to be so highly
correlated that these item pairs were combined into single items. Second, Paradigmatic as-
sumption for mixed methods and Advancing the field of mixed methods research items were not
suitable indicators for measuring the degree of mixed methods adoption construct due to their
extremely limited use in mixed methods studies (i.e., mixed methods abstracts in this study). This
is not to say these elements are not related to mixed methods adoption; however, they were not
suitable to measure the degree of mixed methods adoption when using doctoral dissertation
abstracts in this study. The limited use of Paradigmatic assumption for mixed methods is
consistent with the relevant literature (e.g., Alise & Teddlie, 2010; Coates, 2021; Younas et al.,
2019), and Advancing the field of mixed methods research as a mixed methods practice may not be
applicable to most mixed methods studies unless the study brings about a methodological
innovation (Creswell & Plano Clark, 2018).
This is the first empirical study that provides evidence for individual relationships between
each mixed methods element and degree of mixed methods adoption as well as variation among
studies in the degree they adopt mixed methods research. Although Morgan (2022) has moved the
discussion about prevalence studies beyond the dichotomous operationalization of the mixed
methods adoption by examining the extent of integration with levels and the number of paragraphs
devoted to integration as an indication of mixed methods adoption, he only focused on integrated
results. In this study, however, we measured multiple mixed methods elements and made a clear
distinction between the use of integration techniques and integration results because simply
indicating an integration technique in a study does not guarantee that the qualitative and
quantitative results have actually been integrated. Therefore, our measure is more nuanced than
Morgan’s (2022) approach to assessing mixed methods adoption as it provides a holistic per-
spective of mixed methods adoption. It should be noted, however, that we used dissertation abstracts in our measurement because of their suitability for testing our assumptions; abstracts may be limited in explaining the adoption of mixed methods research because they reflect disciplinary variations in schemes and priorities, institutional guidelines about what to include, and word limits.

Implications
This study has several implications for researchers studying mixed methods adoption, doctoral
students conducting mixed methods studies and their advisors, and scholars interested in adopting
mixed methods research and getting on “the bandwagon of mixed methods” (Morgan, 2022, p. 7).
Given that this analysis provides empirical evidence about the nuanced value of each mixed
methods element, researchers studying mixed methods adoption ought to consider extending
prevalence studies beyond the description of the landscape of the mixed methods practices and
apply more advanced analyses to measure the degree of mixed methods adoption. In addition,
researchers studying mixed methods adoption need to develop standards for the review of mixed
methods research by taking the nuanced value of each mixed methods element into account rather
than treating them as being equally important. Because limitations of the adoption of mixed
methods research may be rooted in graduate education and how doctoral students learn about
mixed methods research (Toraman, 2021), researchers studying mixed methods adoption should
also consider examining mixed methods adoption between groups, over time, and in different
contexts to explain the root cause(s) of the limitations in adopting mixed methods research and to
better address existing disparities by accounting for different contexts. This idea also supports
Morgan’s (2022) recommendation about developing “the core curriculum for graduate education”
(p. 7) and the need for early guidance on how to adopt and use mixed methods research.
Doctoral students conducting mixed methods studies are encouraged to follow the guidelines
and standards for mixed methods research developed by mixed methods researchers (e.g.,
Creswell & Plano Clark, 2011, 2018; DeCuir-Gunby & Schutz, 2017; Fetters, 2020). A recent
study has shown that doctoral students who used mixed methods in their dissertations received
varying degrees of support in learning about mixed methods during their doctoral education, and
their advisors and committee members (i.e., faculty members) played important roles in their
adoption and use of mixed methods research (Toraman, 2021). Therefore, we recommend that advisors and faculty members be open to their doctoral students' use of mixed methods research and facilitate mutual learning about mixed methods so that students can make full use of mixed methods and follow best practices. Finally, scholars interested in adopting mixed methods
research and getting on the mixed methods bandwagon are recommended to understand the history
and foundations of mixed methods research and follow the advances in the field, as the field has
grown tremendously since its establishment—and we expect it to continue doing so (Fetters &
Molina-Azorin, 2017). Further, scholars should consider that there are contextual influences,
different perspectives, and debates concerning mixed methods methodological considerations
(Hesse-Biber & Johnson, 2015; Plano Clark & Ivankova, 2016).

Contributions to the Field of Mixed Methods Research


This study contributes to the field of mixed methods research by providing empirical evidence and an
understanding of how mixed methods elements each have varying importance and play different roles
in the degree of mixed methods adoption. We expand prevalence studies beyond their dichotomous
operationalization of the degree to which mixed methods is adopted and offer a new way of assessing
the adoption of mixed methods that reflects the value of each mixed methods element. Despite the
strong emphasis about the use of mixed methods elements in the literature, with particular importance
on integration (Bryman, 2006; Creswell & Plano Clark, 2018; Fetters et al., 2013; Tashakkori et al.,
2021), to date, there has been no empirical evidence regarding the nuances about many different
mixed methods elements. The results of this study suggest that mixed methods elements have varying
importance for the degree of mixed methods adoption and should not be treated equally.
We admit that our method is more complex than methods typically applied in prevalence studies.
However, we argue that this analytic approach provides a much better reflection of the actual
complexity of mixed methods adoption. Our goal was to encourage novel and critical thought among
researchers regarding the operationalization of mixed methods adoption in prevalence studies. We
hope the empirical evidence provided with the application of an advanced measurement approach in
this study will encourage researchers to develop other ways of examining the degree of mixed methods
adoption to move this discussion beyond current approaches. We also hope the empirical evidence
generated by this analysis will inform the field about the relative levels of adoption for different
elements as well as help researchers and mixed methods instructors identify what mixed methods
elements to pay attention to in their mixed methods studies.

Limitations and Future Directions


We acknowledge that the nature of the results presented in this study warrants self-reflection
focusing on the strengths and weaknesses of the analysis as well as thought-provoking questions for
future directions in the field of mixed methods research (see Table 6). As our goal was to ignite
dialogue in the field of mixed methods research and encourage novel and critical thought among
researchers regarding the operationalization of mixed methods adoption in prevalence studies, we
invite others to challenge us and the field to expand this work with ongoing dialogue and in new
ways to move prevalence studies beyond their dichotomous operationalization of the degree to
which mixed methods research is adopted.

Table 6. Questions for Strength and Weakness of the Analysis and Future Directions in the Field of Mixed Methods Research.

Strength: Item Performance
• Given the high discrimination value associated with Integration techniques, should that item be used to identify mixed methods studies for prevalence studies?
• Are items with high difficulty values underemphasized in mixed methods research instruction?

Weakness: Overall Model Performance
• How well do dissertation abstracts serve as a proxy for the quality of a report?
• Is adoption a generalized construct?

Overall questions for future direction in the field
• Is adoption a generalized construct? Is it discipline, venue, or time dependent?
• Can the field of mixed methods research provide/dictate a theory of mixed methods adoption or are the "right" practices those that researchers/disciplines adopt broadly? Who determines what mixed methods adoption is? How is it determined?
• How can the field of mixed methods research learn/adapt based on the practices that researchers do (and do not) adopt?
• How stable are practices? How do they change over time? How can the field be more effective at helping to change some practices?
• How does the adoption of mixed methods practices predict the transparency of a report?
• How does the adoption of practices predict/relate to/get in the way of the quality of a mixed methods study?
• How do accepted mixed methods practices help researchers think about doing mixed methods differently? When does it help? When does it get in the way?
• Why are so many fascinated by prevalence studies? How does thinking about adoption as more than just prevalence impact our interest in reviewing authors' practices?
• If we had a quality measure of adoption, what could we learn by tracking changes in adoption over time? What practices come and go out of fashion?
• What indicators of mixed methods adoption are missing?
This study was subject to several limitations. First, we used doctoral dissertation abstracts that
might be limited in reflecting the adoption of mixed methods research given that writing a doctoral
dissertation abstract is generally the last step and is often not well guided or well thought out (Belcher, 2009;
In addition, writing doctoral dissertation abstracts may be shaped by institutional
norms, disciplinary expectations, and guidance from advisors and faculty mentors; therefore,
some mixed methods elements may not be included in abstracts despite their importance in a
mixed methods study. Furthermore, some potential elements, such as the inclusion of a joint
display to present integrated mixed methods results, would not be possible in the context of
an abstract. Doctoral dissertation abstracts do, however, provide insight into graduate
students’ mixed methods practices and the abstract schemes that reflect their disciplines.
Second, we assumed and conceptualized mixed methods adoption as a unidimensional latent construct
based on the mixed methods literature, and this assumption may itself be a limitation of this study.
This conceptualization emphasizes interpretability but does not circumvent the complexity of mixed
methods adoption; it is therefore important to acknowledge the difficulty of measuring mixed methods
adoption well. As presented in the results section (see Table 5), across the different item response
theory models (i.e., Rasch, 2PL, and multidimensional), common model fit indices indicated adequate
performance on some criteria (e.g., the Root Mean Square Error of Approximation [RMSEA]) and poor
performance on others (the Comparative Fit Index [CFI] and Tucker–Lewis Index [TLI]). Despite the
strong alignment of the item-level results with mixed methods research theory overall, this fit
performance could stem from our use of dissertation abstracts as cases, our conceptualization of
mixed methods adoption as a latent construct, limitations of the item response theory models
themselves, or other sources.
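To make the model comparisons described above concrete, the sketch below illustrates how such unidimensional item response theory models and their fit indices can be obtained in R. It relies on the mirt package, which is not cited in this article and is used here only for illustration; the data object and its columns are hypothetical rather than drawn from our analysis.

# Minimal illustrative sketch (assumed setup, not a reproduction of our analysis).
# 'elements' is a hypothetical data frame of 0/1 indicators, one column per
# reported mixed methods element (e.g., whether integration techniques were reported).
library(mirt)

rasch_fit <- mirt(elements, model = 1, itemtype = "Rasch")
twopl_fit <- mirt(elements, model = 1, itemtype = "2PL")

# Limited-information fit statistics, including the RMSEA, CFI, and TLI.
M2(rasch_fit)
M2(twopl_fit)

# Item-level discrimination (a) and difficulty (b) estimates from the 2PL model.
coef(twopl_fit, IRTpars = TRUE, simplify = TRUE)$items

A multidimensional specification can be fit analogously by supplying a multi-factor model syntax to mirt in place of the single-factor model.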

Given these somewhat contradictory findings (i.e., poor overall model performance but strong
theoretical support for the item-level results), we encourage future studies to continue exploring
measurement models for mixed methods adoption, the items to include in these models, various
scoring approaches, and alternative conceptualizations of mixed methods adoption, such as an index
rather than a unidimensional latent construct. Additionally, although doctoral dissertation
abstracts were methodologically suitable for our examination of the degree of mixed methods
adoption, future research should consider replicating this study in new settings, including with
peer-reviewed articles, grant applications, or full-text doctoral dissertations.
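As one way of picturing the alternative conceptualizations noted above, the brief sketch below contrasts a simple additive index (a count of reported elements) with a model-based latent score; it continues the hypothetical setup from the earlier sketch and is offered only for illustration.

# Two hypothetical operationalizations of the degree of adoption:
# (1) an additive index: the count of mixed methods elements reported in an abstract;
# (2) a model-based score: the expected a posteriori (EAP) estimate of the latent trait.
library(mirt)
twopl_fit <- mirt(elements, model = 1, itemtype = "2PL")  # as in the earlier sketch

adoption_index <- rowSums(elements)
adoption_theta <- fscores(twopl_fit, method = "EAP")[, 1]

# Examine how closely the two orderings of abstracts agree.
cor(adoption_index, adoption_theta, method = "spearman")

Unlike the latent-trait score, the additive index weights every element equally, which is precisely the assumption that the item-level results of this study call into question.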

Acknowledgments
We would like to thank the reviewers, the associate editor, and the editors for their constructive feedback.

Declaration of Conflicting Interests


The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or
publication of this article.

Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.

ORCID iDs
Sinem Toraman Turk  https://orcid.org/0000-0002-5837-4414
Vicki L. Plano Clark  https://orcid.org/0000-0002-9709-7982

Supplemental Material
Supplemental material for this article is available online.

References
Alise, M., & Teddlie, C. (2010). A continuation of the paradigm wars? Prevalence rates of methodological
approaches across the social/behavioral sciences. Journal of Mixed Methods Research, 4(2), 103–126.
https://doi.org/10.1177/1558689809360805
Bazeley, P. (2015). Adoption of mixed methods approaches to research by management researchers. In
Proceedings of the 14th European Conference on Research Methodology for Business and Management
Studies (ECRM 2015), Valletta, Malta, 11–12 June 2015 (pp. 34–40).
Bazeley, P. (2016). Mixed or merged? Integration as the real challenge for mixed methods. Qualitative
Research in Organizations and Management: An International Journal, 11(3), 189–194. https://doi.org/
10.1108/QROM-04-2016-1373
Bazeley, P. (2018). Integrating analyses in mixed methods research. Sage.
Belcher, W. L. (2009). Writing your journal article in 12 weeks: A guide to academic publishing success.
Sage.
Bryman, A. (2006). Integrating quantitative and qualitative research: How is it done? Qualitative Research,
6(1), 97–113. https://doi.org/10.1177/1468794106058877
Coates, A. (2021). The prevalence of philosophical assumptions described in mixed methods research in
education. Journal of Mixed Methods Research, 15(2), 171–189. https://doi.org/10.1177/
1558689820958210
Coyle, C. E., Schulman-Green, D., Feder, S., Toraman, S., Prust, M. L., Plano Clark, V. L., & Curry, L.
(2018). Federal funding for mixed methods research in the health sciences in the United States: Recent
trends. Journal of Mixed Methods Research, 12(3), 305–324. https://doi.org/10.1177/1558689816662578
Creswell, J. W. (2009). Mapping the field of mixed methods research [Editorial]. Journal of Mixed Methods
Research, 3(2), 95–108. https://doi.org/10.1177/1558689808330883
Creswell, J. W., & Plano Clark, V. L. (2007). Designing and conducting mixed methods research (1st ed.). Sage.
Creswell, J. W., & Plano Clark, V. L. (2011). Designing and conducting mixed methods research (2nd ed.).
Sage.
Creswell, J. W., & Plano Clark, V. L. (2018). Designing and conducting mixed methods research (3rd ed.).
Sage.
de Ayala, R. J. (2009). The theory and practice of item response theory. The Guilford Press.
DeCuir-Gunby, J. T., & Schutz, P. A. (2017). Developing a mixed methods proposal: A practical guide for
beginning researchers. Sage.
Denscombe, M. (2008). Communities of practice: A research paradigm for the mixed methods approach.
Journal of Mixed Methods Research, 2(3), 270–283. https://doi.org/10.1177/1558689808316807
Fàbregues, S., Mumbardó-Adam, C., Escalante-Barrios, E. L., Hong, Q. N., Edelstein, D., Vanderboll, K., &
Fetters, M. D. (2022). Mixed methods intervention studies in children and adolescents with emotional
and behavioral disorders: A methodological review. Research in Developmental Disabilities, 126,
104239. https://doi.org/10.1016/j.ridd.2022.104239
Fetters, M. D. (2020). The mixed methods research workbook: Activities for designing, implementing, and
publishing projects. Sage.
Fetters, M. D., Curry, L. A., & Creswell, J. W. (2013). Achieving integration in mixed methods designs – principles
and practices. Health Services Research, 48(6), 2134–2156. https://doi.org/10.1111/1475-6773.12117
Fetters, M. D., & Freshwater, D. (2015). Publishing a methodological mixed methods research article
[Editorial]. Journal of Mixed Methods Research, 9(3), 203–213. https://doi.org/10.1177/
1558689815594687
Fetters, M. D., & Molina-Azorin, J. F. (2017). The Journal of Mixed Methods Research starts a new decade:
Perspectives of past editors on the current state of the field and future directions. Journal of Mixed
Methods Research, 11(4), 423–432. https://doi.org/10.1177/1558689817729476
Fujimura, J. (1996). Crafting science: A socio-history of the quest for the genetics of cancer. Harvard
University Press.
Fujimura, J. H. (1988). The molecular biological bandwagon in cancer research: Where social worlds meet.
Social Problems, 35(3), 261–283. https://doi.org/10.2307/800622
Granikov, V., Hong, Q. N., Crist, E., & Pluye, P. (2020). Mixed methods research in library and information
science: A methodological review. Library & Information Science Research, 42(1), 101003. https://doi.
org/10.1016/j.lisr.2020.101003
Greene, J. C. (2007). Mixed methods in social inquiry. Wiley.
Hesse-Biber, S., & Johnson, R. B. (Eds.), (2015). The Oxford handbook of multimethod and mixed methods
research inquiry. Oxford University Press.
Howell Smith, M. C., & Shanahan Bazis, P. (2021). Conducting mixed methods research systematic
methodological reviews: A review of practice and recommendations. Journal of Mixed Methods
Research, 15(4), 546–566. https://doi.org/10.1177/1558689820967626
Ivankova, N. V., & Kawamura, Y. (2010). Emerging trends in the utilization of integrated designs in the
social, behavioral, and health sciences. In A. Tashakkori & C. Teddlie (Eds.), Sage handbook of mixed
methods in social and behavioral research (2nd ed., pp. 581–611). Sage.
Johnson, B., & Gray, R. (2010). A history of the philosophical and theoretical issues for mixed methods
research. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral
research (2nd ed., pp. 69–94). Sage.
Maxwell, J. A. (2016). Expanding the history and range of mixed methods research. Journal of Mixed
Methods Research, 10(1), 12–27. https://doi.org/10.1177/1558689815571132

McKim, C. (2017). The value of mixed methods research: A mixed methods study. Journal of Mixed Methods
Research, 11(2), 202–222. https://doi.org/10.1177/1558689815607096
Mertens, D. M., Bazeley, P., Bowleg, L., Fielding, N., Maxwell, J., Molina-Azorin, J. F., & Niglas, K.
(2016). Expanding thinking through a kaleidoscopic look into the future implications of the
mixed methods international research association’s task force report on the future of mixed
methods. Journal of Mixed Methods Research, 10(3), 221–227. https://doi.org/10.1177/
1558689816649719
Molina-Azorin, J. F., & Fetters, M. D. (2016). Mixed methods research prevalence studies: Field-specific
studies on the state of the art of mixed methods research. Journal of Mixed Methods Research, 10(2),
123–128. https://doi.org/10.1177/1558689816636707
Morgan, D. L. (2007). Paradigms lost and pragmatism regained: Methodological implications of combining
qualitative and quantitative methods. Journal of Mixed Methods Research, 1(1), 48–76. https://doi.org/
10.1177/2345678906292462
Morgan, D. L. (2022). Who is on the bandwagon? Core and periphery in mixed methods research. Journal of
Mixed Methods Research, 17(2), 135–142. https://doi.org/10.1177/15586898221096319
O’Cathain, A., Murphy, E., & Nicholl, J. (2007). Integration and publications as indicators of ‘‘yield’’ from
mixed methods studies. Journal of Mixed Methods Research, 1(2), 147–163. https://doi.org/10.1177/
1558689806299094
O’Cathain, A., Murphy, E., & Nicholl, J. (2008). The quality of mixed methods studies in health services
research. Journal of Health Services Research and Policy, 13(2), 92–98. https://doi.org/10.1258/jhsrp.
2007.007074
Plano Clark, V. L. (2010). The adoption and practice of mixed methods: U.S. Trends in federally funded
health-related research. Qualitative Inquiry, 16(6), 428–440. https://doi.org/10.1177/
1077800410364609
Plano Clark, V. L., & Ivankova, N. V. (2016). Mixed methods research: A guide to the field. Sage.
Pluye, P., & Hong, Q. N. (2014). Combining the power of stories and the power of numbers: Mixed methods
research and mixed studies reviews. Annual Review of Public Health, 35(1), 29–45. https://doi.org/10.
1146/annurev-publhealth-032013-182440
R Development Core Team. (2018). R: A language and environment for statistical computing. R Foundation
for Statistical Computing. https://www.R-project.org
Rogers, E. M. (2003). Diffusion of innovations. Free Press.
Ross, A., & Onwuegbuzie, A. J. (2010). Mixed methods research design: A comparison of prevalence in
JRME and AERJ. International Journal of Multiple Research Approaches, 4(3), 233–245. https://doi.
org/10.5172/mra.2010.4.3.233
Tashakkori, A., Johnson, R. B., & Teddlie, C. (2021). Foundations of mixed methods research:
Integrating quantitative and qualitative approaches in the social and behavioral sciences (2nd
ed.). Sage.
Tashakkori, A., & Teddlie, C. (Eds.), (2003). Handbook of mixed methods in social & behavioral research
(1st ed.). Sage.
Tashakkori, A., & Teddlie, C. (Eds.), (2010a). Sage handbook of mixed methods in social & behavioral
research (2nd ed.). Sage.
Tashakkori, A., & Teddlie, C. (2010b). Epilogue: Current developments and emerging trends in integrated
research methodology. In A. Tashakkori & C. Teddlie (Eds.), Sage handbook of mixed methods in social
& behavioral research (2nd ed., pp. 803–826). Sage.
Toraman, S. (2021). How recent doctorates learned about mixed methods research through sources: A mixed
methods social network analysis study. [Doctoral dissertation, University of Cincinnati]. OhioLINK
Electronic Theses and Dissertations Center. https://rave.ohiolink.edu/etdc/view?acc_num=
ucin1637150105919799

Toraman, S., Cox, K., Plano Clark, V. L., & Dariotis, J. K. (2020). Graduate students’ current
practices for writing a mixed methods research study abstract: An examination of doctoral
dissertation and master’s thesis abstracts in the ProQuest Dissertations and Theses Global™
database. International Journal of Multiple Research Approaches, 12(1), 110–128.
https://doi.org/10.29034/ijmra.v12n1a4
Younas, A., Pedersen, M., & Tayaben, J. L. (2019). Review of mixed methods research in nursing. Nursing
Research, 68(6), 464–472. https://doi.org/10.1097/NNR.0000000000000372
