
Received: 21 October 2021 | Accepted: 28 July 2022

DOI: 10.1111/bjso.12568

SPECIAL SECTION PAPER

‘One size doesn't fit all’: Lessons from interaction analysis on tailoring Open Science practices to qualitative research

Bogdana Huma1 | Jack B. Joyce2

1Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
2University of Oxford, Oxford, UK

Correspondence
Bogdana Huma, Department of Language, Literature, and Communication, Faculty of Humanities, Vrije Universiteit Amsterdam, De Boelelaan 1105, 1081 HV, Amsterdam, Netherlands.
Email: [email protected]

Abstract
The Open Science Movement aims to enhance the soundness, transparency, and accessibility of scientific research, and at the same time increase public trust in science. Currently, Open Science practices are mainly presented as solutions to the ‘reproducibility crisis’ in hypothetico-deductive quantitative research. Increasing interest has been shown towards exploring how these practices can be adopted by qualitative researchers. In reviewing this emerging body of work, we conclude that the issue of diversity within qualitative research has not been adequately addressed. Furthermore, we find that many of these endeavours start with existing solutions for which they are trying to find matching problems to be solved. We contrast this approach with a natural incorporation of Open Science practices within interaction analysis and its constituent research traditions: conversation analysis, discursive psychology, ethnomethodology, and membership categorisation analysis. Zooming in on the development of conversation analysis starting in the 1960s, we highlight how practices for opening up and sharing data and analytic thinking have been embedded into its methodology. On the basis of this presentation, we propose a series of lessons learned for adopting Open Science practices in qualitative research.

This is an open access article under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs License, which permits use and
distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made.
© 2022 The Authors. British Journal of Social Psychology published by John Wiley & Sons Ltd on behalf of British Psychological Society.

Br J Soc Psychol. 2023;62:1590–1604.

KEYWORDS
conversation analysis, discursive psychology, interaction analysis, open
data, open science, preregistration, qualitative research, replication,
reproducibility

INTRODUCTION

Trusting the outcomes of scientific research is a fundamental prerequisite for scientific progress. In
the last few decades, the credibility of scientists and their findings has waned due to prominent cases
of fraud, such as that of the social psychologist Diederik Stapel, who published over 50 articles using
invented data (Levelt Committee et al., 2012), and due to the failure to replicate well-known effects,
like ‘ego-­depletion’ (Hagger et al., 2016). In response, the scientific community, with psychology at the
forefront, has started to promote practices that increase the transparency and accountability of research
processes. Widely known as Open Science (OS), these practices preponderantly address issues germane
to hypothetico-­deductive methodologies associated mainly with quantitative research. While there have
been some attempts to adapt OS to qualitative methodologies, its uptake by qualitative researchers,
including social psychologists, seems to be lagging behind.
In the first part of this article, we will argue that the ostensibly moderate adoption of OS by qualita-
tive social psychologists is partially the result of an insufficient fit between the OS practices that are cur-
rently being promoted and qualitative methodologies. We will attempt to demonstrate that, in its current
form, OS is mainly geared towards solving problems inherent to quantitative research. While attempts
have been made to adapt OS practices to qualitative methodologies, we argue that this approach is sub-
optimal because OS practices can often not be satisfactorily adjusted to address issues within qualitative
social psychological research.
In the second part of the article, an alternative course of action is proposed: instead of adapt-
ing mainstream OS practices, qualitative researchers could develop tailored procedures uniquely
equipped to deal with the challenges they are facing. To illustrate this organic development of OS
practices, we will present the case of Interaction Analysis (IA), a set of approaches comprising
discursive psychology, conversation analysis, ethnomethodology, and membership categorization
analysis that have been employed in qualitative social psychological research for over 35 years (e.g.
Antaki, 1988; Billig, 1987; Edwards & Mercer, 1987; Potter & Wetherell, 1987). These approaches
already include OS principles and procedures that are embedded in their methodology and that are
designed to address anticipated limitations associated with interaction analytic research. Thus, we
advance the argument that interaction analysts are already practising OS in forms that are not yet
on the OS radar. The article concludes with a series of ‘lessons learned’ from IA for expanding OS
practices in social psychological research.

BRIEF OVERVIEW OF THE OPEN SCIENCE MOVEMENT

‘Open Science’ (also ‘Open Research/Scholarship’) refers to a set of practices aimed at enhancing
the transparency, efficiency and accessibility of scientific research processes and products, with
the aim of building public trust in scientists and scientific findings (cf. Vicente-­Saez & Martinez-­
Fuentes, 2018; see also the editorial of this special section). While Open Science practices have
been advocated for by sociologists of science for over 40 years (e.g. Merton, 1979), the Open Science
Movement (OSM) developed momentum in psychology after a series of large-­scale replication pro-
jects failed to reproduce many of the significant results reported in a number of experimental and
correlational studies published in prominent scientific journals (Camerer et al., 2018; Open Science
Collaboration, 2015). This set the OSM on a path towards identifying and remedying issues that
can threaten the trustworthiness and robustness of experimental and correlational research and

their findings. In a prominent manifesto for Open Science, Munafò et al. (2017) identified a range
of problematic practices, such as failing to control for researcher or publication biases, designing
low-­quality studies that are underpowered or lacking sufficient control, and scouring datasets in
search for statistically significant results. Similar endeavours followed in other fields such as com-
munication studies (Dienlin et al., 2021), clinical psychology (Tackett et al., 2019) and neuroimaging
(Gorgolewski & Poldrack, 2016). Through these undertakings, the OSM has been configured not
by design, but fortuitously to deal with deficiencies that threaten the credibility of hypothetico-­
deductive experimental and correlational research, leading to the wide use of the labels ‘reproducibil-
ity crisis’ or ‘replicability crisis’ to refer to the range of problems that OS practices can address. Thus,
the vocabulary of quantitative research methodology, which often relies on positivist epistemologi-
cal assumptions, has ended up dominating the discourse of the OSM (Nelson et al., 2021) and now
sets the agenda for its advancement. As we will argue in the next section, these assumptions often
render Open Science practices that are predicated on them difficult to reconcile with qualitative
methodologies.

OPEN SCIENCE AND QUALITATIVE METHODOLOGIES

Qualitative methodologies are often described in comparison to or in opposition with quantitative
methodologies as approaches that rely on the examination of words instead of numbers (Braun &
Clarke, 2013; Silverman, 2011). However, upon reflection, it becomes clear that even this minimal crite-
rion fails to adequately distinguish between the two orientations as, for example, many questionnaires
employ textual labels to accompany numerical scales (cf., Silverman, 2011) and many experimental stud-
ies rely on insights generated through qualitative methods, such as debriefing interviews, to make sense
of quantitative findings (Stainton Rogers, 2011).
Qualitative approaches are enormously diverse. What we understand by ‘words’ as data varies from
short responses provided by participants to interviewers' questions to lengthy naturally occurring
conversations that participants conducted independently of the researcher's intervention. Moreover,
qualitative methodologies differ with respect to what they ‘do’ with
their data (Madill et al., 2000); that is, the methods employed to analyse, interpret, or otherwise
make sense of researcher observations or participants' accounts and the inferences drawn about
the ‘world beyond the words’. Finally, qualitative research methods differ immensely in terms of
their ontological and epistemological assumptions. In fact, some methods are so radically different
in their treatment of language and the knowledge derived from studying it, that they do not fit
within either the qualitative or the quantitative ‘box’ (Stokoe, 2020). In their editorial on Innovating
qualitative research methods, LaMarre and Chamberlain (2022) go one step further and argue that the
heterogeneity of qualitative research methodologies is under threat from an impetus for standard-
ization, for example through the introduction of journal article reporting standards for qualitative
research (Levitt et al., 2018). Echoing Clarke (2021), the authors highlight that this trend can benefit
mainstream qualitative approaches while leaving behind non-traditional ones. Consequently, social
psychologists who employ non-traditional methods may need to do additional work to explain
and legitimize their approaches and find ways to represent their research in line with standards
that might not be entirely applicable to them (see also the discussion on qualitative preregistration
below). As a consequence, social psychologists could be demotivated to choose less conventional
methods over established ones, leading to a qualitative research landscape dominated by interview-­
based research (LaMarre & Chamberlain, 2022).
To sum up, despite the acknowledgement that the distinction between quantitative and qualita-
tive methodologies is unhelpful and not straightforward (Silverman, 2017), it continues to be used
within social and behavioural sciences, including in discussions of OS (e.g. Dienlin et al., 2021;
Tamminen & Poucher, 2018; Sukumar & Metoyer, 2019). The attempts to incorporate OS within
qualitative methodologies, which we review below, are thus complicated by being defined in terms

which are not always directly applicable and that obscure or stifle the diversity within qualitative
methodologies.

Adoption of Open Science by qualitative researchers

To date, qualitative methodologies have occupied a peripheral position in the OSM. Unlike quantita-
tive research that has come under fire across a range of disciplines from medical to social sciences for
an ostensible decrease in the trustworthiness of scientific results (Ritchie, 2020), the dependability of
qualitative research has not been called into question yet. However, calls for qualitative researchers to
adopt OS practices have been issued (Anczyk et al., 2019; Karhulahti, 2022; Sukumar & Metoyer, 2019).
In what follows, we review existing developments to implement replications, preregistration and data
sharing practices within qualitative methodologies.

Reproducibility and replicability of qualitative research

To the best of our knowledge, there have not been any attempts at conducting systematic checks
of the reproducibility of findings from qualitative research, or at setting up large-­scale replications
like the Reproducibility Project (Open Science Collaboration, 2012). Admittedly, these are not core
issues associated with the methodological integrity of qualitative research (Levitt et al., 2018), es-
pecially since not all epistemologies advance the same understanding of reproducibility or even aim
to yield reproducible findings (Tamminen & Poucher, 2018). A core feature of some qualitative
techniques, such as autoethnography, is the recognition that analysts have a central role in the research
process and generate their own memories and understandings of what they have seen, heard and
felt (Hammersley, 2010); thus, their unique world view cannot be bracketed or replicated as such.
Instead, researchers are encouraged to practise transparency and reflexivity and bring to the fore-
front how their life experiences, values, and membership in various identity categories shape their
interpretation of the data.
Still, others hold the view that replications, if redefined to be inclusive of qualitative methodolo-
gies, can help to articulate and compare researcher biases, act as an alternative to transferring find-
ings between similar contexts, and compare research methods, amongst other benefits (see Sukumar &
Metoyer, 2019). For example, Sukumar and Metoyer (2019) claim that some methods within approaches
such as phenomenology, sociolinguistics and critical genres would be amenable to replications. The au-
thors also put forward a new definition of replications which highlights that they need to be conducted
not by the original research team, but by an independent group. The latter would duplicate one or more
aspects of the original study to conduct an interpretative comparison of the original and the replication.
While this extended understanding of replication seems appealing, it does not yet encompass all quali-
tative approaches and it might give the false impression that the methodological integrity of qualitative
research is at stake. It remains to be seen whether researchers accept the challenge of trying to replicate
qualitative findings in accord with the above suggestions, but our sense is that, even in its redefined
form, replication rarely addresses genuine concerns inherent to most qualitative methodologies.

Preregistering qualitative studies

One area of OS where qualitative researchers have made strides is preregistration. Initially, the idea of
preregistering qualitative projects was called into question on the grounds that qualitative studies neither
benefit from, nor are amenable to, having their design laid out in advance (Haven & Van Grootel, 2019).
These misgivings have been judiciously dispelled (Haven & Van Grootel, 2019) and to date there are
several preregistration and pre-­analysis templates available for qualitative research (e.g. Haven & Van

Grootel, 2019; Piñeiro & Rosenblatt, 2018). The form designed by Haven et al. (2020) incorporates the
views of researchers representing several different qualitative traditions and, thus, is supposed to be fit-
ted to a wide range of qualitative approaches. However, as we show below, even this carefully thought-­
out template still falls short of properly accommodating all qualitative research methods.
Despite the effort to be inclusive and flexible, a glance at the template reveals that the choice in
terminology accommodates some qualitative approaches, but not others. Many sections of the form
include response alternatives listed from amongst established qualitative research methods with an ad-
ditional ‘Other’ option that allows researchers, whose approach is not pre-­listed, to include it in the
template. However, this format ends up ‘othering’ less established qualitative methods and reinforcing
the status of mainstream ones. Furthermore, for some sections, this option is missing and researchers
are required to fit their research design into pre-­established linguistic categories, which might end up
mischaracterizing their study.
For example, in Section 13, researchers need to specify the study's ‘sample size’, meaning the num-
ber of ‘interviews/observations/focus groups’ that will be conducted. This item poses several diffi-
culties. First, not all qualitative approaches postulate a representational relationship between the data
and the social world which produced them. Second, for some approaches, like conversation analysis,
what constitutes a sample item is not clear. Conversation analysts collect and examine instances of
naturally occurring social interactions to identify patterns and regularities therein. To build such col-
lections, which can range from a few to hundreds of cases, conversation analysts often record tens or
hundreds of interactions from one or more settings, amounting to up to hundreds of hours of data.
Furthermore, a collection comprises diverse cases, including core, boundary, and deviant cases (Hoey &
Kendrick, 2018) and the size of a collection fluctuates throughout the analysis process. Simply reporting
a total case count obscures the diversity within a collection, since samples are supposedly composed of
homogeneous entities. So, which of these numbers – the number of interactions, the total hours of data,
or the number of cases included in the collection at a particular point in time – is equivalent to the ‘sam-
ple size’? Probably none of these, especially since conversation analysis does not rely on quantification
to validate analytic findings. It is not the frequency with which a particular practice recurs within the
data, but its demonstrable patterned organization that constitutes the proof for its validity (Clayman &
Gill, 2004).
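To make the difficulty concrete, consider a hypothetical (entirely invented) study of complaint responses in service calls: 80 recorded calls amounting to roughly 30 hours of talk might yield, at one stage of the analysis, a collection of 25 core cases, 6 boundary cases, and 3 deviant cases. Reporting ‘80 interactions’, ‘30 hours’, or ‘34 cases’ as the ‘sample size’ would each describe a different object, and the last figure would likely have changed again by the time the analysis is complete.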
To conclude, we hope to have illustrated that the language used in one of the most popular
preregistration templates, developed specifically for qualitative methodologies, falls short of accom-
modating some approaches, and may force researchers to misrepresent their studies or give up on
preregistering them. Thus, the template ends up favouring established methods. Over time, this can
create inequalities in the level of engagement with preregistration and other OS practices across
qualitative approaches. Let us emphasize: we are not arguing that preregistering a qualitative study
cannot or should not be done, just that, in the case of qualitative preregistration, ‘one size fits all’
may not be a felicitous solution.

Sharing qualitative data

An area of debate at the interface of the OSM and qualitative methodologies is the sharing and reusing of
data. The principles of reusing data align with the OSM (enhancing the transparency, efficiency and
accessibility of scientific processes and products) and are widely supported by participants (Mozersky,
Parsons, et al., 2020), yet there is resistance to openly sharing all qualitative data on epistemological
and ethical grounds (see Joyce et al., 2022, for a review of those grounds). The key arguments for a
more nuanced handling of sharing data within the OSM pertain to: (1) ensuring participant anonym-
ity, especially when qualitative data can be particularly intimate and fully informed consent about the
(possibly limitless) data reuse can be difficult to communicate to participants (Branney et al., 2019); (2)
possible errors and inconsistencies in datasets which might have significant consequences, such as a lack
of public trust in the scientific enterprise, or a tragic loss of life (Brown et al., 2018). On epistemological

grounds, objections have been raised on the basis of (3) qualitative data being usually highly contextual
(in both reflecting the researchers' beliefs, judgements, disciplinary assumptions and the environment/
time the data were collected) which can make it difficult or impossible to be meaningfully reinterpreted
by another researcher (Branney et al., 2019; Hammersley, 2010).
The complexity of (re)using qualitative data (e.g. visual, interview, observations) does not sim-
ply mandate a reflexive paradigm to the research, but warrants a rethink of whether and how such
data can ever be reused in secondary analyses. Moreover, the very distinction between primary
and secondary qualitative analyses has come under increased scrutiny (Joyce et al., 2022). Clearly,
these concerns, while not unique to qualitative data, cannot be solved via a narrow one-size-fits-all
approach (cf. Joyce et al., 2022) which does not necessarily account for the breadth of qualitative
approaches and data.
We started this section by highlighting the diversity of approaches within qualitative social psy-
chological research and went on to illustrate how some of the practices that are currently being em-
ployed to enhance the transparency and trustworthiness of quantitative research are either insufficiently
adaptable or fall short of addressing issues germane to qualitative methodologies. In what follows, we
introduce an alternative approach to the implementation of OS within qualitative social psychological
research: the organic development, within IA, of OS practices that specifically address vulnerabilities
intrinsic to this qualitative approach.

INTERACTION ANALYSIS AND OPEN SCIENCE

Interaction analysis incorporates discursive psychology, conversation analysis, ethnomethodology, and
membership categorization analysis, four highly compatible research traditions, employed across the social
sciences, with discursive psychology having been developed within psychology. We refer to IA not as a
unique approach, but as an umbrella term for the four research traditions. While IA utilizes qualita-
tive data, it does not fit neatly into either the qualitative or the quantitative ‘box’ (Stokoe, 2020). Specifically,
interaction analysts work almost exclusively with recordings of naturally occurring social interactions
(Potter & Shaw, 2018), though conversation analysis can be combined with experimental and statistical
methods (de Ruiter & Albert, 2017). Analysts refrain from inferring what participants are thinking, feeling
or experiencing based on what they say or do (Humă, Alexander, et al., 2020). Instead, IA is designed to
excavate the actions accomplished by participants through linguistic, embodied and material resources.
Interaction analysis takes an inductive approach to examining human social interaction and theoriz-
ing its organizing principles. IA differs from other qualitative approaches by promoting ‘unmotivated
looking’, meaning that researchers leave aside any preconceptions about what they expect or want to
find and instead explore the data with an open mind (Psathas, 1990). Even once a research question
has been formulated, it can still be further adapted as the research progresses. To a scientist trained
in hypothetico-­deductive research, this practice looks suspiciously similar to HARKing (Albert & de
Ruiter, 2017). But far from jeopardizing the methodological integrity of a study, this practice ensures the
mutual adequacy between data and research aims (Humă, Alexander, et al., 2020).
Interaction Analysis is also compatible with a more directed ‘motivated looking’ approach, whereby
researchers collect and examine either data from certain environments such as sales encounters (Humă,
Stokoe, & Sikveland, 2020) or family meal-times (Potter & Hepburn, 2020), or data featuring specific
social phenomena such as emotion (Weatherall & Stubbe, 2015) or racism (Burke & Demasi, 2020). Still
these focal interests are unencumbered by theoretical notions or hypotheses in search of support, thus
allowing interaction analysis to be data-­driven and to foster unexpected observations.
Another defining feature of interaction analysis consists in its almost exclusive reliance on the ex-
amination of recordings of naturally occurring social interactions such as calls to helplines, disputes
on public transport, political debates, medical consultations, family meals, to name only a few. These
encounters are subject to Hepburn and Potter's (2021) two tests for naturalistic data: (1) the encounter
should have taken place even without the involvement of the researcher and (2) the data should allow

analysts to retrieve the original actions performed by the participants. Using this kind of data, discursive
psychologists have documented, for example, how emotions such as crying are interactionally man-
aged (Hepburn & Potter, 2007), how derisive laughter can serve an argumentative purpose (Demasi &
Tileagă, 2020), and how displaying empathy can contribute to the achievement of specific institutional
outcomes (Ford et al., 2019).
Importantly, IA distinguishes itself from most qualitative approaches that rely on interpretative or
constructionist epistemologies in which the researcher plays an active role in making sense of par-
ticipants' accounts and co-­produces the data and meaning therein. Instead, IA adopts a members’
perspective and relies on the analyses and orientations that individuals display in and through their talk-­
in-­interaction through such practices as the next turn proof procedure (Edwards, 2004), or self-­repair
(Drew et al., 2013). Interaction analysts refrain from speculating about what people think, feel, know,
or understand and instead rely on what is demonstrable in the data and, thus, accessible to any natural
language user ( Jacobs, 1988). All interaction analytic observations need to be rigorously accounted for,
checked, and checkable in the analysis (see Schegloff, 1996).
Interaction Analysis encompasses four compatible research traditions. Conversation analysis, initi-
ated by Harvey Sacks (1992), focuses on two basic components of interaction (Clift, 2015): action, which
is what we do with the words we use (see Austin, 1962), for example making a complaint, telling a story,
apologizing, requesting and, sequence, which is the chronology and structure of how actions are per-
formed in and through talk and embodied conduct. Membership categorization analysis, developed by
Hester and Eglin (1997) based on their understanding of Sacks' writings and lectures (Sacks, 1992), ex-
plores the organization of categories and associated inferences. The third tradition, discursive psychol-
ogy, is interested in how individuals manage psychological topics such as attitudes, memory, identity, or
social influence in and as part of their daily lives (Edwards & Potter, 1992). Finally, ethnomethodology,
introduced by Harold Garfinkel (1967), excavates the methods individuals use to organize everyday
interactions and to make sense of their own and their interlocutors' conduct. Despite their specific foci,
these four approaches share an analytic outlook that does not, generally speaking, entertain acontextual
explanations of the phenomenon unless locally and demonstrably expressed.

Opening the methodological ‘black box’

Conversation analysts’ preoccupation with the rigour and transparency of their analytic enterprise is
evident across early conversation analytic work (e.g. Sacks, 1984, 1992; Sacks et al., 1974). In the
lectures delivered by Harvey Sacks between 1964 and 1972 and collected by Gail Jefferson in the two-­
volume Lectures on Conversation (Sacks, 1992), two distinct but interrelated aspects of reproducibility
emerge (Schegloff, 1992): the reproducibility (i.e., duplication) of research findings and the repro-
ducibility of the analytic endeavour. To address both aspects of reproducibility, conversation analysts
have developed several methodological procedures, which we outline below, and these are now widely
employed in interaction analytic research.
All interaction analytic empirical publications include extensive data extracts enabling readers to in-
dependently check all analytic claims. This is achieved by incorporating detailed transcripts within the
papers' analytic sections. Most transcripts are produced according to the conventions for transcribing
talk-­in-­interaction originally developed by Gail Jefferson (2004). Increasingly, for transcribing video
data, the conventions introduced by Lorenza Mondada (2018) have been adopted. These transcription
systems ensure that the key features of speech – such as intonation, pitch, silences and overlaps – and
of embodied conduct – such as the start, trajectory, and end of body movement – are accurately repre-
sented in the data extracts (Hepburn & Bolden, 2017). This practice not only allows authors to refer to
these details in their analyses to support their claims, but it also enables readers to verify these claims
independently. Furthermore, readers get access not only to what the authors chose to focus on in their
analyses, but to all other transcribed features of the scrutinized interaction, which allows them to enter-
tain alternative analyses of the data.
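To give a sense of what such transcripts look like, consider a short constructed fragment (invented for illustration, not drawn from any corpus), rendered in simplified Jeffersonian notation:

01  Ann:  I was ↑so: surprised, (0.4) °honestly°.
02  Ben:  [Mm:.]
03  Ann:  [An-] and then she just (.) left.

Square brackets mark overlapping talk, numbers in parentheses timed silences in seconds, ‘(.)’ a micropause, colons sound stretches, upward arrows marked pitch rises, hyphens cut-offs, and degree signs quieter talk (see Hepburn & Bolden, 2017, for the full conventions).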

Interaction Analysis's preoccupation with rigour results in IA researchers mostly abstaining from
making analytic claims that are not backed up by evidence from the transcripts. This means not only
refraining from referring to events that are not included in the shown data, but also from explaining
participants' actions in terms of their psychologies (i.e., wants, needs, thoughts, emotions) or their social
identities (e.g. woman, doctor, child, husband), unless the latter have been made relevant and conse-
quential by the participants themselves (Schegloff, 2007). Instead, analysts rely on prior empirical find-
ings from published research to corroborate their observations.
In fact, since its inception, IA, and especially conversation analysis, has accumulated a large body
of coherent empirical results, much like what Kuhn (1962) would call ‘normal science’. Importantly, by
drawing on, checking and validating prior results in new datasets, IA allows for the ‘integrative replica-
tion’ (Freese & Peterson, 2017, p. 151) of these findings. This constitutes an informal replication pro-
cess, the by-­product of researchers' efforts to produce new knowledge by building on existing empirical
findings (Freese & Peterson, 2017).
In addition to the above-­described tools providing for ‘reproducible descriptions’ (Sacks, 1992, p.
11) of social phenomena and for the integrative replication of findings, IA also employs practices for
enhancing the intersubjective verifiability of analytic claims, prior to write-­up. This is achieved via
data sessions, wherein researchers show their data prior to publication, and in some cases, prior to
having any findings. These sessions offer a fertile ground for jointly interrogating data, for corrobo-
rating discoveries, for learning the method, and for retelling lay observations in a refined and precise
manner. The session follows a regular structure (see ten Have, 2007 for an extended description of
the structure), though the specifics might differ between research groups. The joint interrogation
of data adds a layer of rigour to the analytic process through the intersubjective substantiation of
analyses.

Opening-­up qualitative data

A tenet of the OSM, open data, is a principal way that scientific transparency can be achieved. Threats
to scientific trustworthiness stem from findings based on inaccessible data, for which it is not clear what
evidence has been used to arrive at the findings and so the reader (and public at large) must rely on the
word of the author. For interaction analysis, the transparent use of data is a staple of the method and a
consequence of the early scholars using what data they had available to them and sharing them amongst
the small community. Transparent use and reuse of data is aptly surmised by Sacks (1984, p. 27): ‘People
often ask me why I choose the particular data I choose. [..] And I am insistent that I just happened to
have it, it became fascinating, and I spent some time at it’.
Methodologically and historically, IA has preserved features which have made it well-­suited to
ensure open data (Albert et al., 2018). Data are often shared prior to publication, in publications, and following
project completion. It is expected that IA publications will feature detailed transcriptions using ap-
propriate conventions. This not only allows findings to be independently verified by reviewers and
readers, but it also enables scholars to reuse the data in their own analyses of the same or of
different phenomena. Technological advances and the ability to embed recordings in journal publi-
cations1 are beginning to raise those expectations; for example, IA articles that examine ‘online’ re-
cordings (e.g. YouTube, TikTok) have started to include links to original videos (e.g. Joyce et al., 2021).
Increasing data accessibility is an effort to formalize the central yet capricious practice of IA re-
searchers retaining and sharing the original recordings. Open data is thus not only a practical re-
quirement for publication, but a guiding principle through which incremental steps are taken to
continually improve data accessibility.
Sharing one's data and analysis is fundamental for IA and practising OS. This goes hand-­in-­
hand with documenting the research process and making it available for others to see (Tamminen &

1
See for example the journal Social Interaction at https://tidsskrift.dk/socialinteraction

Poucher, 2018). IA does not typically use the terms ‘reproducibility’ and ‘replicability’ as they imply that
efforts to ensure transparency and fit with the OSM are exogenous to the science, but for IA these are
baked into its methodology.

CHALLENGES FOR THE OPEN SCIENCE AGENDA OF INTERACTION ANALYSIS

Highlighting how IA might usefully contribute to the discussion on OSM and qualitative methodolo-
gies has hopefully demonstrated that one size does not fit all when it comes to practising OS. In the
remainder of this article, we critically reflect on the challenges that interaction analysts, and to some
extent qualitative approaches at large, must still tackle to advance the OS agenda.

Expertise, technology and access

IA's development, focus, and engagement in OS practices have been shaped by the available technology
of the time. For example, CA began with the wider availability of consumer recording technologies al-
lowing researchers to replay, pause, and slow down recordings of social interaction, thus permitting a
very fine-grained approach to analysis. Importantly, recordings and transcripts were available to be
duplicated and physically shared, which formed one of the key pillars of IA – the impetus on sharing
data. The early practices for sharing data are not without criticism, which we discuss later in this section.
Over time, the development and availability of technology have allowed for materials to be securely
stored and widely shared in online repositories – for example, the personal website of conversation ana-
lyst Emanuel Schegloff2 or the CABank3 at first, and then more recently the Open Science Framework,
and ResearchGate.

Expertise

The promise of these new and accessible systems has been huge, yet fulfilling that promise has required
adoption and investment by institutions and individuals. UK Research and Innovation (UKRI)
released a concordat in 2016 stating that all publicly funded research data must be openly available and,
in a recent review, reemphasized that there should be as few restrictions as possible on data (UKRI,
2016). Despite this, concerns persist about the secure sharing of data, the expertise needed to choose an
appropriate platform, and the use of those platforms (Mozersky, Walsh, et al., 2020), especially consid-
ering data protection regulations in the United Kingdom and the European Union. For IA (and other
qualitative approaches), concerns about safeguarding data are often exacerbated by ethics committees
which impose restrictions on data collection and sharing, rendering any (re)use or checking of data diffi-
cult if not impossible. To be clear, we are not arguing against safeguarding participants, but highlighting
that a lack of relevant expertise in unfamiliar disciplines and technologies – for both ethics committees
and for researchers – may inadvertently erect barriers to fully openly accessible data. So while funders
and the OSM push for open research data, the perception that IA (and qualitative) data are incredibly
sensitive and confidential means that for many studies data sharing becomes virtually impossible due
to ethical restrictions.
While qualitative researchers can formulate arguments to counter misconceptions and advocate
against overly restrictive or burdensome barriers, it is much more difficult to solve the problem of

2
https://www.sscnet.ucla.edu/soc/faculty/schegloff/
3
https://ca.talkbank.org/

data splintering. In IA, as the field and technology have developed, new tools, transcription sys-
tems and software have been created. This has had undoubted benefits for research, but the hidden
barrier of expertise, especially in the face of the OSM, is yet to be addressed. Those barriers may
include, for example, additional training, software and time, all of which disproportionately im-
pact early career researchers, particularly interdisciplinary qualitative early career researchers, who
might rely on diverse and varied datasets. Accessibly preparing data is a remedy, but not a panacea,
and instead shifts the burden to the data sharer. Further critical discussion on where the burden of
data preparation lies is needed (but see Pownall et al., 2021 on the costs of OS practices for early
career researchers).

Technology

It is not only a lack of relevant expertise which impedes IA's engagement in the OSM but also technol-
ogy. While IA has been quick to adapt new technologies for data collection and analysis, other areas
have lagged behind. Many traditional outlets such as academic journals are ill-­equipped to host mul-
timedia materials, so although IA typically presents data alongside the analysis, this usually includes
anonymized transcripts and not audio or video recordings. Consequently, the relationship between the
discipline-specific practices and solutions, and the available infrastructure can both help and hinder
engagement with OS. For instance, constraints on word count and formatting in journals might
result in a limited description of the steps taken to arrive at the analytic findings, which, for readers
outside of the discipline, can make it challenging to understand the process through which the
researcher arrived at those findings.
These problems relating to infrastructure impact the consistency in presentation of data. With only
the ability to present data as textual representations, the reviewer and reader must trust, without the
ability to check, that those representations accurately reflect the recordings. The problem of consis-
tency and verification pervades the storing and sharing of data in IA. Albert and Hofstetter (in prep. a)
describe this issue and suggest some common practices for managing data – such as how to catalogue
and label data, and what tools exist to analyse large corpora. We agree with their argument that ‘any
prescriptive technological recommendations would quickly become redundant’ (para 1), so rather than
give exact solutions and prescribe concrete guides to data management, we would suggest, in the in-
terest of data reuse and ensuring participant confidentiality, that data corpora be consistently, accurately,
and fully described with protocols narrating the technical choices made in the process of data storage.
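By way of illustration, and with the caveat that these fields are hypothetical rather than a prescribed standard, a minimal corpus-description protocol might record for each corpus:

Corpus: name and setting (e.g. helpline calls)
Recordings: number, total duration, and file formats
Consent scope: primary use only / reuse within the team / open deposit
Anonymization: e.g. names bleeped in audio, pseudonyms used in transcripts
Transcription: conventions applied (e.g. Jefferson, 2004; Mondada, 2018)
Access: open / on request / restricted, naming the repository and a contact point

Such a record travels with the data and makes the technical choices inspectable without prescribing any particular tool.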

Access to data

Interaction Analysis can be incredibly time-­intensive and, coupled with research in harder-­to-­access
environments, it can prove out of reach for early career and marginalized researchers who are restricted
by time and precarious employment. It is a well-­made argument that access to datasets benefits both
these groups (see Jepson et al., 2017) and participants and institutions that may be burdened by the in-
vestment necessary to engage in research. Transparency and access by way of the OSM ease the time
and cost burden on researchers for gaining access, collecting, and transcribing/preparing data. Simply
put, widening access to data is one way that a research community can support the work of early career
and marginalized scholars and foster an environment where data are continually scrutinized to check
and expand upon prior work.
The black box of IA data has traditionally been opened following project completion with data cor-
pora stored in physical and online repositories, mirroring the practices of other disciplines. However,
the difficulty of relying on this as the sole method of ensuring open data is that corpora are often spo-
radically organized and inconsistently labelled (Albert & Hofstetter, in prep. a), hindering the efficiency

and reusability of purportedly open data. One such example of the pitfalls of data access has recently
been reflected on within the conversation analytic community in relation to its handling of its ‘classic
data’ corpus. For conversation analysis, ‘classic data’ refers to data corpora central to the early work in
the field and which have been widely reused and reanalysed in the course of the field's development.
The problem lies in issues of openness and representation. For openness, despite there being alleged
unrestrained access across the conversation analytic community, the reality is that due to ethical con-
cerns, only a small subset of scholars has access to the corpus. This is an issue which is reviewed in
detail by Albert and Hofstetter (in prep. b), who explore the problem of this creating an in- and out-
group sense within the community. Moreover, these data, collected in a particularly middle-class area
on the west coast of the United States in the 1960s and 1970s, have favoured a particular view of lan-
guage use which has pervaded the history of conversation analysis. Recent efforts by groups such as the
Ethnomethodology and Conversation Analysis for Racial Justice (EMCA4RJ) have sought to address
the lack of diversity in data by supporting the access to data in different languages and from traditionally
minoritized communities in IA (Sciubba et al., 2021).
Data accessibility, whilst the gold standard for transparency, still presents problems when improp-
erly handled. To be clear, we are arguing that data access in IA has afforded rich opportunities for
subsequent analyses and has furnished IA with reproducibility and replicability baked into the process.
However, ad-hoc sharing practices controlled by a small group in the community and an over-reliance
on limited datasets can adversely impact OSM efforts.

LESSONS LEARNED

Open Science unquestionably benefits psychology. To date, it has gone some way to redeem the cred-
ibility of psychological research in the wake of the far-­reaching ‘reproducibility crisis’. This article has
shown that, as it stands, the OS agenda has been mainly centred around challenges faced by quantita-
tive research. We argued that current efforts to include qualitative research in the OSM by adjusting
OS practices to fit with qualitative methodologies fall somewhat short of their goal. Specifically, these
attempts do not properly address the heterogeneity of qualitative methodologies and do not provide
solutions to their unique challenges. We contrasted this ‘one size fits all’ approach with the organic in-
corporation of OS practices within interaction analysis. In this last section, we put forward three lessons
that can be learned from how interaction analysts developed OS practices that mitigate the challenges
inherent to this methodology.

Lesson 1: ‘Look inward’

Interaction analysis has developed procedures to deal with issues related to reproducibility, transpar-
ency, and data sharing that were identified as threatening the integrity of the method. The transparent
use and reuse of data in IA is grounded in the early development of the approach because the research-
ers ‘happened to have it’ (Sacks, 1984, p. 27) and so other researchers could ‘make of it what they could’
(p. 26). The way that IA approaches examine and handle their data – by not entertaining acontextual
explanations of phenomena unless that context is demonstrably made relevant in the interaction, and by
sharing data during and after the analysis – underscores the principles of verification and transparency.
Even if this may be unique to IA, other qualitative approaches may have developed similar safeguarding
procedures that are compatible with their epistemological assumptions. The emphasis on data accessi-
bility throughout the research process has benefitted not only the individual researchers via the support
for arriving at and verifying their findings, but also the wider academic community via opportunities
for integrative replication and reproducibility. It is worth acknowledging these practices and relating
them to the OS agenda.

Lesson 2: ‘Look outward’

Interaction Analysis's distinct epistemology has influenced the natural incorporation of Open Science
practices by its practitioners, and whilst many of these are unique to the approach, they still might use-
fully inform or be adopted by other methods to enhance their transparency and trustworthiness. One
such example is given by Albert and de Ruiter (2017), who advocate for legal HARKing, which experi-
mental psychologists with an interest in studying social encounters could adopt. This means that, before
designing an experiment, the researchers may want to consider examining some naturally occurring
data to ground their theorizing and hypothesizing in real life examples. The lesson here is that quali-
tative methodologies can also offer practical solutions to credibility challenges faced by quantitative
research, not only the other way around.

Lesson 3: ‘Look ahead’

While there are a number of Open Science practices used amongst qualitative approaches, barriers still
exist in a few areas. For instance, technology and knowledge lag behind the willingness of qualitative
researchers to engage in Open Science. New technologies have proven to be immensely beneficial for
verifying findings, making data accessible, and promoting trust in the scientific process. Whilst new
technologies have arrived to provide solutions, the resources required to learn and adapt to these sys-
tems present barriers to the fullest engagement with Open Science. As we discussed throughout this
paper, many of the barriers to Open Science are solved with tools and concepts designed for quan-
titative research, which leaves qualitative researchers to figure out what problems might be solved by
these solutions rather than the other way around. Thus expertise, from funders to ethics committees to
repository administrators, is necessary to avoid imposing ill-fitted solutions for Open Science barriers
which do not exist for qualitative research and, vice versa, for finding solutions to barriers unique to it.
Taken together, these three lessons hopefully demonstrate the potential of qualitative social psycho-
logical researchers not only to embrace, but also to drive innovation within OS. While it may appear at
times that qualitative methodologies are incompatible with OS, we have aimed to highlight that these
are cases of specific qualitative approaches being at odds with specific OS practices, but not with un-
derlying OS principles. Finally, we have shown that some qualitative social psychologists – those who
have been using interaction analytic methods – have already been practising OS for several decades. We
hope this will inspire social psychologists working with other methodological approaches to identify OS
practices within their own field.

AUTHOR CONTRIBUTIONS
Bogdana Huma: Conceptualization; writing – original draft; writing – review and editing. Jack B.
Joyce: Conceptualization; writing – original draft; writing – review and editing.

ACKNOWLEDGEMENTS
We are grateful to the York St John University ReproducibiliTEA Journal Club, and in particular to
Natalie Smith, Jelena Mirkovic and Scott Cole, for inspiring us to start thinking about the compatibility
of Open Science and qualitative methodologies. We would also like to thank Tom Douglass for stimu-
lating discussions around the challenges of implementing Open Science.

CONFLICT OF INTEREST
All authors declare no conflict of interest.

DATA AVAILABILITY STATEMENT
Data sharing is not applicable to this article as no datasets were generated or analysed during the cur-
rent study.

ORCID
Bogdana Huma https://orcid.org/0000-0003-0482-9580
Jack B. Joyce https://orcid.org/0000-0001-9499-1471

REFERENCES
Albert, S., Albury, C., Alexander, M., Harris, M. T., Hofstetter, E., Holmes, E. J. B., & Stokoe, E. (2018). The conversa-
tional rollercoaster: Conversation analysis and the public science of talk. Discourse Studies, 20(3), 397–­424. https://doi.
org/10.1177/14614​45618​754571
Albert, S., & de Ruiter, J. (2017). Legal HARKing: Theoretical grounding in interaction research. Proceedings of Cognitive Science,
1525–­1530. https://cogsci.mindm​odeli​ng.org/2017/paper​s/0299/paper​0299.pdf
Albert, S., & Hofstetter, E. (in prep. a). Data management 1: Privacy, security, and access.
Albert, S., & Hofstetter, E. (in prep. b). Data management 2: Data management as analysis.
Anczyk, A., Grzymała-­Moszczyńska, H., Krzysztof-­Świderska, A., & Prusak, J. (2019). The replication crisis and qualita-
tive research in the psychology of religion. International Journal for the Psycholog y of Religion, 29(4), 278–­291. https://doi.
org/10.1080/10508​619.2019.1687197
Antaki, C. (Ed.). (1988). Analysing everyday explanation. A caseook of methods. Sage.
Austin, J. L. (1962). How to do things with words. Oxford University Press.
Billig, M. (1987). Arguing and thinking. A rhetorical approach to social psycholog y. Cambridge University Press.
Branney, P., Reid, K., Frost, N., Coan, S., Mathieson, A., & Woolhouse, M. (2019). A context-­consent meta-­framework for
designing open (qualitative) data studies. Qualitative Research in Psycholog y, 16(3), 483–­502. https://doi.org/10.1080/14780​
887.2019.1605477
Braun, V., & Clarke, V. (2013). Successful qualitative research: A practical guide for beginners. Sage.
Brown, A. W., Kaiser, K. A., & Allison, D. B. (2018). Issues with data and analyses: Errors, underlying themes and potential solutions. Proceedings of the National Academy of Sciences of the United States of America, 115(11), 2563–2570. https://doi.org/10.1073/pnas.1708279115
Burke, S., & Demasi, M. A. (2020). "This country will be big racist one day": Extreme prejudice as reasoned discourse in face-to-face. In Political communication: Discursive perspectives (pp. 205–229). Palgrave Macmillan.
Camerer, C. F., Dreber, A., Holzmeister, F., Ho, T. H., Huber, J., Johannesson, M., Kirchler, M., Nave, G., Nosek, B. A., Pfeiffer, T., Altmejd, A., Buttrick, N., Chan, T., Chen, Y., Forsell, E., Gampa, A., Heikensten, E., Hummer, L., Imai, T., … Wu, H. (2018). Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nature Human Behaviour, 2(9), 637–644. https://doi.org/10.1038/s41562-018-0399-z
Clarke, V. (2021). Navigating the messy swamp of qualitative research: Are generic reporting standards the answer? Qualitative Research in Psychology, 19(4), 1004–1012. https://doi.org/10.1080/14780887.2021.1995555
Clayman, S. E., & Gill, V. T. (2004). Conversation analysis. In M. Hardy & A. Bryman (Eds.), Handbook of data analysis (pp. 589–606). Sage.
Clift, R. (2015). Conversation analysis. Cambridge University Press.
de Ruiter, J. P., & Albert, S. (2017). An appeal for a methodological fusion of conversation analysis and experimental psychology. Research on Language and Social Interaction, 50(1), 90–107. https://doi.org/10.1080/08351813.2017.1262050
Demasi, M. A., & Tileagă, C. (2020). Rhetoric of derisive laughter in political debates on the EU. Qualitative Psychology, 8, 328–342. https://doi.org/10.1037/qup0000156
Dienlin, T., Johannes, N., Bowman, N. D., Masur, P. K., Engesser, S., Kümpel, A. S., Lukito, J., Bier, L. M., Zhang, R., Johnson, B. K., Huskey, R., Schneider, F. M., Breuer, J., Parry, D. A., Vermeulen, I., Fisher, J. T., Banks, J., Weber, R., Ellis, D. A., … Vreese, C. D. (2021). An agenda for Open Science in communication. Journal of Communication, 71(1), 1–26. https://doi.org/10.1093/joc/jqz052
Drew, P., Walker, T., & Ogden, R. (2013). Self-repair and action construction. In M. Hayashi, G. Raymond, & J. Sidnell (Eds.), Conversational repair and human understanding. Cambridge University Press.
Edwards, D. (2004). Proof procedure. In M. S. Lewis-Beck, A. Bryman, & T. F. Liao (Eds.), The Sage encyclopaedia of social science research methods (pp. 875–876). Sage.
Edwards, D., & Mercer, N. (1987). Common knowledge: The development of understanding in the classroom. Methuen.
Edwards, D., & Potter, J. (1992). Discursive psychology. Sage.
Ford, J., Hepburn, A., & Parry, R. (2019). What do displays of empathy do in palliative care consultations? Discourse Studies, 21(1), 22–37. https://doi.org/10.1177/1461445618814030
Freese, J., & Peterson, D. (2017). Replication in social science. Annual Review of Sociology, 43, 147–165. https://doi.org/10.1146/annurev-soc-060116-053450
Garfinkel, H. (1967). Studies in ethnomethodology. Prentice-Hall.
Gorgolewski, K. J., & Poldrack, R. A. (2016). A practical guide for improving transparency and reproducibility in neuroimaging research. PLoS Biology, 14(7), e1002506. https://doi.org/10.1371/journal.pbio.1002506
Hagger, M. S., Chatzisarantis, N. L. D., Alberts, H. C. O., Calvillo, D. P., Campbell, W. K., Cannon, P. R., Carlucci, M., Carruth, N. P., Cheung, T., Crowell, A., De Ridder, D. T. D., Dewitte, S. M., Elson, M., Evans, J. R., Fay, B. A., Fennis, B. M., Finley,
A., Francis, Z., Heise, E., … Zwienenberg, M. (2016). A multilab preregistered replication of the ego-depletion effect. Perspectives on Psychological Science, 11(4), 546–573. https://doi.org/10.1177/1745691616652873
Hammersley, M. (2010). Can we re-use qualitative data via secondary analysis? Notes on some terminological and substantive issues. Sociological Research Online, 15(5), 1–7.
Haven, T. L., Errington, T. M., Gleditsch, K. S., van Grootel, L., Jacobs, A. M., Kern, F. G., Piñeiro, R., Rosenblatt, F., & Mokkink, L. B. (2020). Preregistering qualitative research: A Delphi study. International Journal of Qualitative Methods, 19, 1609406920976417. https://doi.org/10.1177/1609406920976417
Haven, T. L., & Van Grootel, L. (2019). Preregistering qualitative research. Accountability in Research, 26(3), 229–244.
Hepburn, A., & Bolden, G. (2017). Transcribing for social research. Sage.
Hepburn, A., & Potter, J. (2007). Crying receipts: Time, empathy, and institutional practice. Research on Language and Social Interaction, 40(1), 89–116.
Hepburn, A., & Potter, J. (2021). Designing research for naturally occurring data. In U. Flick (Ed.), The SAGE handbook of qualitative data analysis. Sage.
Hester, S., & Eglin, P. (1997). Culture in action: Studies in membership categorization analysis. International Institute for Ethnomethodology and Conversation Analysis and University Press of America.
Hoey, E. M., & Kendrick, K. H. (2018). Conversation analysis. In A. M. B. de Groot & P. Hagoort (Eds.), Research methods in psycholinguistics: A practical guide (pp. 151–173). Wiley Blackwell.
Humă, B., Alexander, M., Stokoe, E., & Tileagă, C. (2020). Introduction to special issue on discursive psychology. Qualitative Research in Psychology, 17(3), 313–335. https://doi.org/10.1080/14780887.2020.1729910
Humă, B., Stokoe, E., & Sikveland, R. O. (2020). Putting persuasion (back) in its sequential context. Qualitative Research in Psychology, 17(3), 357–371. https://doi.org/10.1080/14780887.2020.1725947
Jacobs, S. (1988). Evidence and inference in conversation analysis. Annals of the International Communication Association, 11(1), 433–443. https://doi.org/10.1080/23808985.1988.11678700
Jefferson, G. (2004). Glossary of transcript symbols with an introduction. In G. H. Lerner (Ed.), Conversation analysis: Studies from the first generation (pp. 13–31). John Benjamins Publishing Company.
Jepson, M., Salisbury, C., Ridd, M. J., Metcalfe, C., Garside, L., & Barnes, R. (2017). The 'One in a Million' study: Creating a database of UK primary care consultations. British Journal of General Practice, 67(658), e345–e351.
Joyce, J. B., Douglass, T., Benwell, B., Rhys, C. S., Parry, R., Simmons, R., & Kerrison, A. (2022). Should we share qualitative data? Epistemological and practical insights from conversation analysis. International Journal of Social Research Methodology, 1–15. https://doi.org/10.1080/13645579.2022.2087851
Joyce, J. B., Humă, B., Ristimäki, H.-L., Ferraz de Almeida, F., & Doehring, A. (2021). Speaking out against everyday sexism: Gender and epistemics in accusations of 'mansplaining'. Feminism & Psychology, 31(4), 502–529. https://doi.org/10.1177/0959353520979499
Karhulahti, V. M. (2022). Registered reports for qualitative research. Nature Human Behaviour, 6(1), 4–5. https://doi.org/10.1038/s41562-021-01265-8
Kuhn, T. S. (1962). The structure of scientific revolutions. University of Chicago Press.
LaMarre, A., & Chamberlain, K. (2022). Innovating qualitative research methods: Proposals and possibilities. Methods in Psychology, 6, 100083. https://doi.org/10.1016/j.metip.2021.100083
Levelt Committee, Noort Committee, & Drenth Committee. (2012). Flawed Science: The fraudulent research practices of social psychologist
Diederik Stapel. Levelt Committee, Noort Committee, & Drenth Committee.
Levitt, H. M., Bamberg, M., Creswell, J. W., Frost, D. M., Josselson, R., & Suárez-Orozco, C. (2018). Journal article reporting standards for qualitative primary, qualitative meta-analytic, and mixed methods research in psychology: The APA Publications and Communications Board task force report. American Psychologist, 73(1), 26–46. https://doi.org/10.1037/amp0000151
Madill, A., Jordan, A., & Shirley, C. (2000). Objectivity and reliability in qualitative analysis: Realist, contextualist and radical constructionist epistemologies. British Journal of Psychology, 91(1), 1–20. https://doi.org/10.1348/000712600161646
Merton, R. K. (1979). The sociology of science: Theoretical and empirical investigations. The University of Chicago Press.
Mondada, L. (2018). Multiple temporalities of language and body in interaction: Challenges for transcribing multimodality. Research on Language and Social Interaction, 51(1), 85–106. https://doi.org/10.1080/08351813.2018.1413878
Mozersky, J., Parsons, M., Walsh, H., Baldwin, K., McIntosh, T., & DuBois, J. M. (2020). Research participant views regarding qualitative data sharing. Ethics and Human Research, 42(2), 13–25. https://doi.org/10.1002/eahr.500044
Mozersky, J., Walsh, H., Parsons, M., McIntosh, T., Baldwin, K., & DuBois, J. M. (2020). Are we ready to share qualitative research data? Knowledge and preparedness among qualitative researchers, IRB members, and data repository curators. The International Association for Social Science Information Service and Technology Quarterly, 8(43), 1–26. https://doi.org/10.29173/iq952
Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie Du Sert, N., Simonsohn, U., Wagenmakers, E. J., Ware, J. J., & Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1(1), 0021. https://doi.org/10.1038/s41562-016-0021
Nelson, N. C., Ichikawa, K., Chung, J., & Malik, M. M. (2021). Mapping the discursive dimensions of the reproducibility crisis: A mixed methods analysis. PLoS One, 16(7), e0254090. https://doi.org/10.1371/journal.pone.0254090
Open Science Collaboration. (2012). An open, large-scale, collaborative effort to estimate the reproducibility of psychological science. Perspectives on Psychological Science, 7(6), 657–660. https://doi.org/10.1177/1745691612462588
Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), 943–950. https://doi.org/10.1126/science.aac4716
Piñeiro, R., & Rosenblatt, F. (2018). Pre-analysis plans for qualitative research. Revista de Ciencia Política, 36(3), 785–796.
Potter, J., & Hepburn, A. (2020). Shaming interrogatives: Admonishments, the social psychology of emotion, and discursive practices of behaviour modification in family mealtimes. British Journal of Social Psychology, 59, 347–364. https://doi.org/10.1111/bjso.12346
Potter, J., & Shaw, C. (2018). The virtues of naturalistic data. In U. Flick (Ed.), The SAGE handbook of qualitative data collection (pp. 182–199). Sage.
Potter, J., & Wetherell, M. (1987). Discourse and social psychology: Beyond attitudes and behaviour. Sage.
Pownall, M., Talbot, C. V., Henschel, A., Lautarescu, A., Lloyd, K., Hartmann, H., Darda, K. M., Tang, K. T. Y., Carmichael-Murphy, P., & Siegel, J. A. (2021). Navigating Open Science as early career feminist researchers. Psychology of Women Quarterly, 45(4), 526–539. https://doi.org/10.31234/osf.io/f9m47
Psathas, G. (1990). Introduction: Methodological issues and recent developments in the study of naturally occurring interaction. In G. Psathas (Ed.), Interaction competence (pp. 1–30). International Institute for Ethnomethodology and Conversation Analysis.
Ritchie, S. (2020). Science fictions: How fraud, bias, negligence, and hype undermine the search for truth. Random House.
Sacks, H. (1984). Notes on methodology. In J. M. Atkinson & J. Heritage (Eds.), Structures of social action: Studies in conversation analysis (pp. 21–27). Cambridge University Press.
Sacks, H. (1992). Lectures on conversation (Vols. 1 and 2). Basil Blackwell.
Sacks, H., Schegloff, E. A., & Jefferson, G. (1974). A simplest systematics for the organization of turn-taking for conversation. Language, 50(4), 696–735. https://doi.org/10.1016/B978-0-12-623550-0.50008-2
Schegloff, E. A. (1992). Introduction. In H. Sacks, Lectures on conversation (Vol. 1, pp. ix–lxii). Blackwell Publishers.
Schegloff, E. A. (1996). Confirming allusions: Toward an empirical account of action. American Journal of Sociology, 102(1), 161–216.
Schegloff, E. A. (2007). A tutorial on membership categorization. Journal of Pragmatics, 39(3), 462–482.
Sciubba, E., Shrikant, N., & Williamson, F. (2021). Guest blog: EM/CA for racial justice. Research on Language and Social Interaction blog. https://rolsi.net/2021/06/02/guest-blog-em-ca-for-racial-justice/
Silverman, D. (2011). Interpreting qualitative data. Sage.
Silverman, D. (2017). Doing qualitative research. Sage.
Stainton Rogers, W. (2011). Social psychology (2nd ed.). McGraw Hill Open University Press.
Stokoe, E. (2020). Psychological matters in institutional interaction: Insights and interventions from discursive psychology and conversation analysis. Qualitative Psychology, 7(2), 331–347. https://doi.org/10.1037/qup0000162
Sukumar, P. T., & Metoyer, R. (2019). Replication and transparency of qualitative research from a constructivist perspective. https://doi.org/10.31219/osf.io/6efvp
Tackett, J. L., Brandes, C. M., King, K. M., & Markon, K. E. (2019). Psychology's replication crisis and clinical psychological science. Annual Review of Clinical Psychology, 15, 579–604. https://doi.org/10.1146/annurev-clinpsy-050718-095710
Tamminen, K. A., & Poucher, Z. A. (2018). Open science in sport and exercise psychology: Review of current approaches and considerations for qualitative inquiry. Psychology of Sport and Exercise, 36, 17–28. https://doi.org/10.1016/j.psychsport.2017.12.010
ten Have, P. (2007). Doing conversation analysis. Sage.
UKRI. (2016). Concordat on Open Research Data. https://www.ukri.org/files/legacy/documents/concordatonopenresearchdata-pdf/
Vicente-Saez, R., & Martinez-Fuentes, C. (2018). Open Science now: A systematic literature review for an integrated definition. Journal of Business Research, 88, 428–436.
Weatherall, A., & Stubbe, M. (2015). Emotions in action: Telephone-mediated dispute resolution. British Journal of Social Psychology, 54(2), 273–290. https://doi.org/10.1111/bjso.12082

How to cite this article: Huma, B., & Joyce, J. B. (2023). 'One size doesn't fit all': Lessons from interaction analysis on tailoring Open Science practices to qualitative research. British Journal of Social Psychology, 62, 1590–1604. https://doi.org/10.1111/bjso.12568
