
International Journal of Drug Policy 94 (2021) 102910
journal homepage: www.elsevier.com/locate/drugpo

Research paper

Enacting ‘more-than-human’ care: Clients’ and counsellors’ views on the multiple affordances of chatbots in alcohol and other drug counselling

Anthony Barnett a,∗, Michael Savic a, Kiran Pienaar b, Adrian Carter c, Narelle Warren d, Emma Sandral a, Victoria Manning a, Dan I. Lubman a

a Monash Addiction Research Centre, Eastern Health Clinical School, Monash University, VIC and Turning Point, Eastern Health, Richmond, VIC, Australia
b School of Humanities and Social Sciences, Deakin University, Melbourne, VIC, Australia; and School of Social Sciences, Monash University, Melbourne, VIC, Australia
c School of Psychological Sciences and Turner Institute for Brain and Mental Health, Monash University, Melbourne, VIC, Australia and University of Queensland Centre of Clinical Research, University of Queensland, Brisbane, QLD, Australia
d School of Social Sciences, Faculty of Arts, Monash University, Melbourne, VIC, Australia

Article info

Keywords: Chatbots; Artificial intelligence; Online care; Addiction; Australia; Affordances

Abstract

Forms of artificial intelligence (AI), such as chatbots that provide automated online counselling, promise to revolutionise alcohol and other drug treatment. Although the replacement of human counsellors remains a speculative prospect, chatbots for ‘narrow AI’ tasks (e.g., assessment and referral) are increasingly being used to augment clinical practice. Little research has addressed the possibilities for care that chatbots may generate in the future, particularly in the context of alcohol and other drug counselling. To explore these issues, we draw on the concept of technological ‘affordances’ and identify the range of possibilities for care that emerging chatbot interventions may afford and foreclose depending on the contexts in which they are implemented. Our analysis is based on qualitative data from interviews with clients (n=20) and focus group discussions with counsellors (n=8) conducted as part of a larger study of an Australian online alcohol and other drug counselling service. Both clients and counsellors expressed a concern that chatbot interventions lacked a ‘human’ element, which they valued in empathic care encounters. Most clients reported that they would share less information with a chatbot than a human counsellor, and they viewed this as constraining care. However, clients and counsellors suggested that the use of narrow AI might afford possibilities for performing discrete tasks, such as screening, triage or referral. In the context of what we refer to as ‘more-than-human’ care, our findings reveal complex views about the types of affordances that chatbots may produce and foreclose in online care encounters. We conclude by discussing implications for the potential ‘addiction futures’ and care trajectories that AI technologies offer, focussing on how they might inform alcohol and other drug policy, and the design of digital healthcare.

∗ Corresponding author.
E-mail address: [email protected] (A. Barnett).
https://doi.org/10.1016/j.drugpo.2020.102910
0955-3959/© 2020 Elsevier B.V. All rights reserved.

Introduction

Within healthcare, the possible future applications of ‘chatbots’ — artificially intelligent computer programs that aim to simulate human conversation — continue to receive significant attention. Recent studies have explored the potential of chatbots to deliver healthcare information and support (Rizzo et al., 2016), detect and prevent certain behaviours (e.g., suicide) (Martínez-Miranda, 2017), and support treatment delivered by physicians in different fields of medicine (e.g., oncology) (Bibault, Chaix, Nectoux, & Brouard, 2019). Furthermore, the novel coronavirus (COVID-19) pandemic has encouraged a rapid move towards telemedicine and online healthcare, including the use of chatbots informed by artificial intelligence (AI) technologies to respond to healthcare needs during a pandemic (e.g., World Health Organisation, 2020).

As part of a wider investment in digital health, chatbots continue to be framed as an important part of future ‘e-therapies’ within psychiatry and mental health services for the treatment of mental health and alcohol and other drug concerns (Gratzer & Goldbloom, 2020). In the National Health Service Topol Review, which investigated the application of technology within mental healthcare, the use of chatbots to deliver mental health services in the future was specifically identified as an important part of a suite of automated, digital health interventions (Foley & Woollard, 2019). Universities, governments and private software companies are investing at unprecedented rates in chatbot and mobile health (‘mHealth’) technologies (Silva, Rodrigues, de la Torre Díez, López-Coronado, & Saleem, 2015). These technologies promise to change how care is provided by delivering mental health services to larger audiences at cheaper cost, beyond the time, space and geographical constraints of traditional, face-to-face healthcare.

Although chatbots are framed as a futuristic technological solution to overcome barriers to treatment access, the views of clients and counsellors about how these technologies may impact care have largely remained underexplored. Drawing on concepts from science and technology studies (STS), we critically examine clients’ and counsellors’ views about the affordances of chatbots for alcohol and other drug care. In doing so, we provide a novel perspective on the potential social implications of digital health technologies and reflect on the possibilities for ‘more-than-human’ care into the future.

Chatbots and healthcare delivery

The potential use of chatbots to deliver healthcare interventions, such as counselling, has a surprisingly long history. McCorduck (2004) traced the history of chatbots back to Joseph Weizenbaum, who in the early 1960s at MIT produced a system called ‘ELIZA’. ELIZA was a computer program based on early forms of natural language processing — the use of automated techniques to analyse and respond to human language — and was designed to imitate the role of a Rogerian psychoanalyst by communicating with a human user (or ‘patient’) on a computer console. At the time, Weizenbaum argued that ELIZA was better viewed as an advance in language processing technology rather than as a new technology for healthcare. By the mid-1970s, Weizenbaum had launched a critique of the developing field of AI, arguing that chatbots and AI systems designed to replace humans were immoral, unethical and lacked the human capacity for empathy on which effective psychotherapy depends (McCorduck, 2004).
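To illustrate how little computational machinery lay behind ELIZA's apparent empathy, the following sketch mimics its well-known pattern-matching and pronoun-reflection technique in Python. It is our minimal illustration, not Weizenbaum's original code (which was written in MAD-SLIP), and the rules shown are invented examples:

```python
import re

# A minimal ELIZA-style responder: scripted regex rules plus pronoun
# 'reflection'. Illustrative only; the rules below are invented and far
# simpler than ELIZA's keyword-ranking script.

REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}

RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"because (.*)", re.I), "Is that the real reason?"),
    (re.compile(r".*"), "Please tell me more."),  # catch-all rule
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words so the reply mirrors the user."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(utterance: str) -> str:
    """Return the template of the first rule whose pattern matches."""
    for pattern, template in RULES:
        match = pattern.match(utterance.strip())
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please go on."  # unreachable, given the catch-all rule above

print(respond("I feel anxious about my drinking"))
# -> Why do you feel anxious about your drinking?
```

Even this toy version produces superficially attentive replies, which is precisely the gap between simulated and felt empathy that Weizenbaum's critique targeted.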
The technological capability of chatbots has advanced since ELIZA. Current chatbots aim to employ various forms of AI technologies to enable greater autonomy. These include natural language processing and machine learning, where a computer system ‘learns’ and improves performance based on past experiences and interactions (Nguyen, 2020). One example is ‘SimSensei’, a fully automated chatbot that conducts interviews to assess psychological distress (Morency et al., 2015). Within alcohol and other drug service delivery, a number of chatbots of varying technical complexity have been developed. These include ‘TalkToFrank’ in the United Kingdom, which is designed to provide young people with information about drugs (Home Office, 2013), and ‘Bzz’ in the Netherlands, which promises to answer adolescents’ questions related to sex, drugs and alcohol (Crutzen, Peters, Portugal, Fisser, & Grolleman, 2011).

Viewed along a continuum, chatbots operating within ‘hybrid’ models of care alongside humans to perform simple tasks such as screening or referral have been characterised as examples of ‘narrow AI’. In contrast, future chatbots designed to replace human functions (e.g., to emulate a counsellor) are considered forms of ‘artificial superintelligence’ (Müller & Bostrom, 2016). Traditional models of care involving face-to-face interactions between clients and clinicians are increasingly being augmented by narrow AI digital health interventions, such as smartphone applications or simple chatbots that can perform basic functions, such as referring clients to a human-led service (Denecke, Tschanz, Dorner, & May, 2019).
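As a concrete, deliberately simplified illustration of the kind of 'narrow AI' task described above, the sketch below walks a fixed set of screening questions, scores the answers, and either provides referral information or hands the session to a human counsellor. The questions, threshold and actions are our own illustrative assumptions, not those of any deployed service:

```python
from dataclasses import dataclass

# A hypothetical 'narrow AI' triage step within a hybrid model of care:
# scripted screening, a simple score, then referral or human handoff.
# Questions, threshold and actions are illustrative assumptions only.

SCREENING_QUESTIONS = [
    "In the past month, how often did you use alcohol or other drugs? (0-4)",
    "How often have you been unable to stop once you started? (0-4)",
    "How often has your use interfered with work, study or family? (0-4)",
]

@dataclass
class TriageOutcome:
    action: str    # "handoff_to_counsellor" or "provide_resources"
    message: str

def triage(answers: list[int], handoff_threshold: int = 6) -> TriageOutcome:
    """Score the scripted screening answers and decide the next step."""
    score = sum(answers)
    if score >= handoff_threshold:
        # The chatbot's role ends here: a human counsellor joins the chat.
        return TriageOutcome("handoff_to_counsellor",
                             f"Score {score}: connecting you with a counsellor now.")
    return TriageOutcome("provide_resources",
                         f"Score {score}: here are some services and self-help resources.")

print(triage([3, 2, 3]))  # high score -> human handoff
print(triage([1, 0, 1]))  # low score  -> information and referral links
```

The design point is the handoff: the automated step only gathers and filters, while decisions requiring judgement and empathy remain with the human counsellor.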
The promise of, and increasing investment in, digital health interventions raises the question of whether chatbots offer ‘hype or hope’ for future healthcare. In a recent review, Denecke et al. (2019) suggested that the strengths of chatbots include their capacity to follow a ‘conversational tree’ and perform simple, specific tasks, such as patient history-taking or patient education. However, they cautioned that chatbots are not without disadvantages. For example, patients may become exhausted or frustrated if a chatbot does not understand their concerns and needs, or if too many patient interactions with chatbots are required. On the one hand, chatbots may offer future opportunities for healthcare if they become more ‘intelligent’ and conversational barriers with patients are minimised. On the other, it is unclear whether and how chatbots could emulate the traditional doctor-patient relationship, which is built on trust and face-to-face communication (Denecke et al., 2019).

Empirical research examining the impact of chatbots on the therapeutic relationship, care experiences and outcomes is urgently needed (Laranjo et al., 2018). A recent survey of physicians in the US found that while many viewed chatbots as potentially useful for undertaking simple administrative tasks, such as booking appointments or client coaching, concerns were raised about their inability to comprehend human emotion and to deliver expert medical care. Little research has been conducted to examine the views of other types of providers, including counsellors, who play a critical role in the alcohol and other drug treatment field. Outside of survey-based research, only a few qualitative studies have explored health professionals’ and clients’ views about the acceptability of chatbots in healthcare more generally (e.g., Laumer, Maier, & Gubler, 2019; Nadarzynski, Miles, Cowie, & Ridge, 2019).

Crucially, though, little if any research has explored clients’ and counsellors’ views about how chatbots used in future alcohol and other drug service delivery may impact their experiences of care. This is surprising because digital health interventions may be especially relevant or useful for people with alcohol and other drug concerns in order to overcome barriers to accessing care, including: (i) drug-related stigma that may discourage treatment seeking; and, (ii) service delivery barriers such as lack of access to care in remote areas (Budney, Borodovsky, Marsch, & Lord, 2019). Drawing on concepts from STS, including work on technological ‘affordances’ (Gibson, 1979; Hutchby, 2001; Latour & Venn, 2002; Norman, 1988) and ‘more-than-human’ approaches (Dennis, 2019), this paper responds to this opening in the literature by examining clients’ and counsellors’ perceptions of the technological and social effects of chatbots in online alcohol and other drug care. The findings may have important implications for the design of future digital healthcare interventions and the kinds of ‘addiction futures’ that these interventions materialise and foreclose in alcohol and other drug care.

Theoretical approach

In exploring clients’ and counsellors’ accounts of the future technological and social effects of chatbots in online care, our analysis is informed by the concept of ‘affordances’ (Gibson, 1979; Hutchby, 2001; Latour & Venn, 2002; Norman, 1988). In his seminal work The Ecological Approach to Visual Perception, Gibson (1979) described ‘affordances’ in terms of how environments offered or ‘furnished’ animals different opportunities. A key tenet of Gibson’s work is that affordances are not innate, fixed physical properties of an environment, but rather emerge as opportunities or constraints that an environmental feature might provide to a particular subject. For Gibson, different features of an environment have the ability to provide unique affordances for particular subjects (or even to the same subject at different points in time).

Norman (1988) extended Gibson’s (1979) work by applying the concept of ‘affordances’ to human-computer interactions. Where Gibson highlighted the role of environments, Norman (1988) foregrounded the agency of technology designers and developers by arguing that affordances are ‘designed in’ properties that provide indications of how technology could be used. Drawing on STS and posthumanist theory, socio-material approaches have extended thinking around affordances (Hutchby, 2001; Latour & Venn, 2002). Rather than viewing affordances as predominantly shaped by the environment or determined by humans who design and use technology, socio-material approaches view affordances as emerging through human and non-human actors as they coalesce in encounters with technology. Given that human and non-human actors (e.g., discourses, time, places, objects) may combine differently in specific encounters with a technology, such as a chatbot, the opportunities for action (or ‘affordances’) that a technology enables or constrains are always contingent and situated. In view of this, a socio-material approach underscores the need to examine the unique relations of human and non-human forces in specific contexts in order to trace how affordances are differentially constituted (Dilkes-Frayne, Savic, Carter, Kokanović, & Lubman, 2019).
Within critical drug studies, this socio-material approach to ‘affordances’ has been mobilised by Fraser, Treloar, Gendera, and Rance (2017) to critically analyse the design of injecting packs for hepatitis C prevention. Recognising the need to incorporate social relationships in the design of the prevention object, Fraser and colleagues proposed a new injecting pack aimed at couples who inject together. This innovative approach to hepatitis C prevention treats the sexual partnership as the primary unit of intervention and aims to generate new affordances or possibilities for prevention. Socio-material approaches to affordances have since been productively applied to analyse a range of phenomena including the uses and effects of naloxone (an overdose reversal drug) (Farrugia et al., 2019), supervised drug consumption sites (Boyd et al., 2020) and, related to our own work, the relationship between online counselling platforms and therapeutic outcomes (Dilkes-Frayne, Savic, Carter, Kokanović, & Lubman, 2019). The concept of affordances has also been mobilised in a study of the gender affordances of chatbots, specifically how the gender of a chatbot influences users’ engagement with it (Brahnam & De Angeli, 2012). Brahnam and De Angeli found that users tended to attribute negative stereotypes to female-presenting chatbots more often than male-presenting chatbots, and that the former were more often subject to implicit and explicit sexual attention and swear words.

By drawing on a socio-material approach to affordances, we explore clients’ and counsellors’ views on how chatbots might enable (‘afford’) or constrain alcohol and other drug online care. While the existing critical drug studies literature has applied the concept of ‘affordances’ to explore past encounters with technology, we focus on participants’ perceptions of future encounters involving chatbots. In doing so, we attend to the role of anticipation and imagination, along with a range of other actors, in shaping the types of realities, possibilities and actions that may emerge when chatbots are encountered in future implementation situations (Groves, 2017). In this way, we approach participants’ accounts as implicated in making alcohol and other drug treatment and care futures.

Our analysis is also informed by an emerging body of scholarship applying ‘more-than-human’ approaches within critical drug studies. Fay Dennis’ (2017, 2019) ethnography of injecting drug use is a key example. In keeping with the posthumanist turn, Dennis shifts the focus away from the individual injecting subject by exploring how injecting drug use emerges through the relations of human and non-human actors. In doing so, her account disrupts anthropocentric conceptualisations of drug use as the outcome of human practices and decisions. Instead, her work illuminates how drug injecting events materialise via the often fragile coalescence of human and non-human phenomena (e.g., syringes, substances, prohibitionist drug policies, the availability of sterile injecting equipment). By unsettling taken-for-granted distinctions between human/non-human, subject/object and agency/passivity, Dennis’ approach encourages a more capacious understanding of agency as distributed along the human/non-human spectrum, rather than being the sole prerogative of individual human subjects. She concludes by advocating for a ‘more-than-human’ approach to care, one that extends beyond current harm reduction approaches and has the potential to “reconfigure our relationship to drugs and legitimise ways of living with drugs that are currently neglected, undermined, or worse still, punished” (Dennis, 2019, p. 199).

In the context of chatbots and online care, a ‘more-than-human’ approach invites us to rethink dominant addiction treatment models that tend to frame care as a human-centric, one-directional practice, which is provided to clients by treatment professionals. For example, in the conventional ‘doctor-patient relationship’, the medical professional is often conceptualised as the care provider, in control of (and responsible for) the clinical encounter. Despite the power asymmetries at play in clinical contexts, care is also typically considered self-evidently beneficial and nurturing, irrespective of the local contexts in which it is enacted. Questioning this framing of care as a one-directional form of service provision, our analysis draws on critical scholarship that emphasises care as relational, situated and made in everyday practices, including those associated with help-seeking, diagnosis and treatment (Puig de la Bellacasa, 2017; Mol, 2008). This work considers care as differentially constituted — sometimes as supportive and sometimes as oppressive and coercive — depending on the unique configuration of human and non-human actors at work in specific care practices. Given the potential for care to materialise as coercive, and given that some forms of care (e.g., between people who consume drugs) are routinely obscured, care is political, contested and implicated in the making and maintenance of particular realities (Puig de la Bellacasa, 2017; Mol, 2008; Martin et al., 2015; Murphy, 2015). Recent critical drug studies scholarship has similarly approached care as an ethico-political pursuit and productively traced the social, affective and material practices that bring care into being, or otherwise constrain or foreclose specific care practices, in relation to injecting drug use (Dennis, 2019), naloxone (Farrugia et al., 2019), and drug consumption rooms (Duncan et al., 2019). Inspired by this work, we examine the affordances and limits of AI technologies for online care, paying particular attention to the social, affective and material actors at play.

The advent of new technologies such as chatbots holds the potential for a redistribution of care between human and non-human actors. As Donna Haraway (1985; 2006) notes in her influential book, A Cyborg Manifesto, the distinction between animal and machine is increasingly being blurred: a blurring that unsettles anthropocentric accounts of the unified human subject and illuminates the shift to the hybridised posthuman of technoscience. While Haraway’s concept of the cyborg is often cited as a feminist critique of gender, the blurring of the human/non-human in the image of the cyborg has far-reaching implications, including for understandings of treatment and care. In the context of our work, the disruption of a hard and fast distinction between the human and the non-human invites a ‘more-than-human’ approach capable of capturing the dispersal of care across human/non-human (or ‘more-than-human’) relations. Applying this approach to analysing online care prompts us to (re)consider how human and chatbot hybridised models (where human and non-human actors work together and shape each other) afford multiple possibilities for care with varying implications for alcohol and other drug treatment futures.

Methods

The qualitative data presented in this article were part of a broader mixed-methods study that explored clients’ and counsellors’ experiences of online care. The study was approved by the Eastern Health Human Research Ethics Committee (Reference: HREC/18/EH/38).

Participants

All participants had experience of an Australian national online, 24-hour real-time web-chat alcohol and other drug counselling service, Counselling Online. Clients who had used the Counselling Online service were directed to a site after their counselling session ended where they could register their interest in participating in this study. After a client had registered their interest, we contacted them to discuss their participation. In total, 10 male and 10 female clients were recruited. Client participants had a mean age of 38 years (ranging from 22 to 76 years; NB: age not available for one participant) and were located across a range of different states of Australia. Client participants’ previous engagements with alcohol and other drug treatment services varied. Many clients had seen a general practitioner in primary health care to discuss their alcohol and other drug concerns, some had experience of specialist addiction treatment services, and for a few, Counselling Online was the only service they had accessed in the past. The primary drugs of concern for which clients accessed online counselling included: alcohol (45%), methamphetamine (30%), cannabis or synthetic cannabis (15%), cocaine (5%) and pharmaceutical medications (5%). We report clients’ primary drug of concern in the findings when presenting narrative excerpts.
Counsellors who participated in the study were all currently working at Counselling Online. Counsellors were recruited via Counselling Online team leaders, who referred counsellors to our study based on their availability and desire to participate. In total, 8 counsellor participants were recruited, including 5 males and 3 females. The median duration of counsellors working at the service was 2 years (ranging from 1 to over 10 years). The counsellors came from a range of professional backgrounds including social work, psychology and counselling, and had received specialised training in telephonic and online alcohol and other drug counselling.

Data Collection

The interviews and focus groups were conducted by AB, MS and ES. Twenty interviews were conducted with clients. The interview schedule covered a range of topics, including: clients’ experiences of online care; clients’ views about what constitutes quality online care; and, the focus of the current article, clients’ views about how chatbots afford or constrain online care. Interviews were conducted over the phone and audio-recorded, and clients received a $30 AUD voucher for their participation.

Three focus groups were conducted with counsellors (two focus groups contained three participants and one contained two participants). The focus group schedule included a range of topics, including: counsellors’ experiences of online care; counsellors’ views about strategies and techniques to deliver empathic care online; and, as reported in this paper, counsellors’ views about how chatbots might afford or constrain online care. Focus groups were conducted in person and audio-recorded.

Data Analysis

Interviews and focus group audio recordings were transcribed verbatim. Transcripts were imported into the NVivo qualitative database management program and thematic analysis (Braun & Clarke, 2006) was conducted. This involved AB, MS and ES developing an initial coding framework based on readings of a few transcripts, which was then discussed and agreed on by all authors. AB then coded the transcripts. Themes were developed collaboratively in discussions between the authors and were informed by our reading of the data in relation to the theoretical work on affordances and ‘more-than-human’ approaches.

Findings

In conducting our analysis, we begin by discussing how clients and counsellors imagined and encountered chatbots in everyday life. We then present clients’ and counsellors’ views on the affordances and constraints of chatbots for online care and client interactions. We use pseudonyms to protect participant confidentiality.

Imagining and encountering chatbots

Many clients stated that they knew they had communicated with a human counsellor during their online counselling sessions based on the responses received. As one client commented:

You could tell that by the way they answered me and asked me questions […] Yeah, you could tell it wasn’t a [chatbot] – I’ve had one of those computer-generated things before. No, no I could tell this was an actual person on the other end. (Jill, Female, 59, alcohol)

Even though Jill had engaged with counselling delivered via text through an online interface, she reported feeling that the care she experienced had an affective element which only a human counsellor could provide. That is, the caring came in human form. However, other clients were less sure about the type of counsellor (whether human or chatbot) with which they were communicating. Indeed, some counsellors discussed how, at the start of an online counselling session, certain clients would seek to explicitly establish what type of counsellor they were talking to by asking if the counsellor was a robot or human. As illustrated in the following exchange, some counsellors commented that it was difficult to ‘prove’ their status as a human counsellor in an online care encounter:

Counsellor 3: Yeah. There are times where they’ll [clients] come into the conversation and think you’re a robot.
Counsellor 2: Yes, I’ve had that question.
Counsellor 1: ‘Is this a real person?’
Counsellor 3: How to respond to that?
Counsellor 1: Every response that you type still sounds like a robot!
Counsellor 2: ‘You sound like a robot!’
Counsellor 1: ‘No I’m not!’ But that’s what a robot would say.
Counsellor 2: Yeah, exactly. ‘I am here to support you.’
[Laughter]

Hence, some clients expressed the expectation that chatbots may already be in use. The status of the ‘human’ counsellor as naturally given and uncontentious could no longer be taken for granted with the advent of new digital technologies, which blur the boundaries between human and non-human.

Many participants drew on their previous, often negative or frustrating, everyday life experiences when describing the affordances and constraints of chatbots. For example, some commented that discussing the topic of chatbots reminded them of long wait times and service difficulties when interacting with, for example: “voice recognition where like you ring the ATO (Australian Taxation Office) […] to go through a whole bunch of stuff to get to what you actually physically want” (Kate, Female, 36, alcohol), or communicating with a chatbot at “Telstra (an Australian telecommunications company) […] to be hit with a computer, it’s impersonal” (John, Male, 61, alcohol). These comments suggest that previous encounters with task-specific chatbots with limited capacity to emulate human conversation shaped participants’ views on the affordances/constraints of chatbots for online care. If their previous encounters with task-specific chatbots were at best underwhelming, or at worst frustrating, it is perhaps no surprise that participants expressed reservations about the potential for chatbots to deliver empathic online care (as we discuss in the next section).

However, beyond these mundane and often frustrating everyday interactions, some participants drew on science fiction imaginaries from popular culture when considering the potential use of chatbots in the future. For instance, in a focus group discussion, counsellors humorously discussed whether they had all seen “The Mirror” or “Terminator” when referencing familiar science fiction tropes of machines taking over the world. Clients also mentioned science fiction. For example, when the interviewer commented that a participant seemed knowledgeable about chatbots, the participant explained: “I’ve watched a bit of movies, man, and I do a bit of research” (Bryce, Male, 26, cannabis). In these examples, participants often humorously drew upon imaginaries that depicted a dystopian future where machines/robots replace or conflict with humans. For some participants, such a future, while still in the realm of the imaginary, left little scope for considering the positive possibilities these technologies may afford.

Perceived affordances/constraints of chatbots for online care

Having described participants’ general comments about chatbots, we now present their accounts of the potential affordances and constraints of chatbots for online care.
Chatbots as constraining empathic care

Many participants expressed concerns that if chatbots were to replace human counsellors in online counselling, empathic care afforded by human-delivered counselling would be constrained. As Bryce (Male, 26, cannabis) stated:

It’s like trying to make a machine understand human life. You can’t do it, unless they have autonomous – unless they are aware – like, they have sentience. Let’s put it that way. You can’t make a computer understand what a human is feeling. They can’t exactly give the right answers, whereas a normal human can.

Bryce and other participants raised the concern that the affective capacities of care would be diminished by chatbots, which were seen as lacking empathy and emotional intelligence. Related to this, participants expressed concern that chatbots would be unable to respond to clients’ desires and concerns. In light of Puig de la Bellacasa’s (2017) engagement with care as an ethico-political obligation, chatbots may be seen in this sense as potentially lacking response-ability — the capacity to respond affectively to the concerns, emotions and desires of others, in short to display empathy and compassion. While not wishing to dismiss participants’ concerns, Bryce’s view that AI cannot provide the “right answers” could also be interpreted as resting on the assumption that there are correct or incorrect responses in online care encounters that only a human can provide. However, complicating the anthropocentric notion that providing the “right answers” is the sole prerogative of human subjects, critical drug studies scholarship has foregrounded the many possible (and often contested/political) ways of enacting ‘problems’ and ‘responses’ in relation to alcohol and other drugs – not necessarily right or wrong in perpetuity but socio-historically situated, and as such, relationally constituted across the human-non-human spectrum with different effects (see for example: Barnett, Dilkes-Frayne, Savic, & Carter, 2018; Fraser & Moore, 2011; Lancaster, Seear, Treloar, & Ritter, 2017; Pienaar & Savic, 2016; Savic, Ferguson, Manning, Bathish, & Lubman, 2017).

Other participants also questioned the capacity of chatbots to weigh up complex information, read emotions, and respond with empathy and compassion. For example, Rob (Male, 30, alcohol) stated that chatbots may not afford the empathic care that people desire in online care settings:

Oh, the human counsellor can obviously show empathy and emotion where the automation can’t. A lot of people access these things just to speak to someone.

Similarly, Jess (Female, 53, methamphetamine) noted:

Yeah I think there needs to be, it needs to have a human face behind it because of the empathy that’s needed to support someone on their journey.

As these accounts indicate, many client participants expressed the view that human actors are still needed in online counselling to deliver empathic care. That is, the full replacement of human actors with chatbots within online care was viewed as undermining the important empathic qualities desired in care encounters. In view of recent critical drug studies scholarship on care (see Dennis, 2019; Duncan et al., 2019; Farrugia et al., 2019), these participants’ accounts challenge the notion that chatbots are able to satisfy one element of caring: the affective component of helping people to feel a certain way.

Elaborating on the point further, Sarah (Female, 22, promethazine) suggested that the best counsellors are likely to be those with lived experience of mental health concerns themselves (which chatbots could not have):

But just the whole empathy and understanding, a robot’s not got depression. It doesn’t suffer from hormonal imbalances through its life to know what’s going on. […] I know a lot of my psych tutors, they’ve always been, yeah, the best. Usually, some of the best people in the psych, in the history of psychology had mental health issues themselves, so they would understand and that kind of thing. I don’t think that that could be properly replicated.

Sarah’s account suggests that genuine empathy cannot be a property designed and built into a machine nor acquired through AI technologies (for example, machine learning). For Sarah, the capacity of an actor to experience emotional pain was confined to human subjects. Implicit here is an understanding of empathy as the sole prerogative of humans. Importantly, empathy is seen as contingent on the capacity to have experienced emotional pain, and similar kinds of emotional pain in particular, and thus be able to imagine (or empathise with) the pain of others. In turn, the use of chatbots without this capacity in online counselling settings would constrain empathic care, which may undermine the therapeutic relationship. As Rachel (Female, 22, methamphetamine) explained, clients just want to be listened to by “human ears or seen by human eyes.”

In another example, Joe (Male, 25, methamphetamine and diazepam) emphasised that chatbots may prevent human connections, which he viewed as constituting a potential threat to delivering empathic care in the digital age:

Because [a chatbot] stops humans from connecting to other humans and in a world of digital technology we need to hold onto that humanity.

Joe’s account reiterates the desire for the human not to be lost amidst the proliferation of digital technologies and human-technology relations. His account is consistent with discourses about technologies as social ills, as ‘addictive’, as eroding social connection and community, and the sorts of dystopian imaginaries described earlier. However, it runs counter to dominant discourses in digital health, where digital health technologies are promoted for their potential to connect people — indeed, the importance of digital connections has been thrown into stark relief during the COVID-19 pandemic. Against this view, Joe highlights the central role of human actors in alcohol and other drug treatment futures and online care.

Finally, counsellors agreed that chatbots were not sufficiently technologically advanced to be programmed to offer empathic counselling. However, some forecast a not-so-distant future in which these constraints could be overcome through advances in chatbots and AI:

Counsellor 5: If the chatbot was perfectly programmed and was able to display levels of…
Counsellor 7: Intelligence.
Counsellor 5: …empathy and yeah, intelligence and interact flexibly with the client, explore with them and then do all that stuff, I still feel like even if the client knew or even if it was a person on the other end who said, ‘yes, I’m a chatbot’, I think that that on its own would be a significant enough factor to take away from the perceived value of that service because they’re not being – I imagine they feel…
Counsellor 7: Not being heard.
Counsellor 5: …like they’re not being heard by a person. That they’re not being supported. That they’re still in this alone.
Facilitator 1: Yeah, okay, that…
Counsellor 5: I don’t know how you’d humanise a program to the point where it feels like there’s actually somebody there.
Facilitator 1: Yeah, that’s…
Counsellor 6: It depends how far in the future we’re talking. If we’re talking the next five years but if we’re talking 20 years maybe that’s how we’re all comfortable talking like to chatbots and chatbots are our life. Who knows how we’ll feel in 20 years time?
Counsellor 5: It’s certainly not inconceivable that you can develop a relationship with a program.

In this focus group interaction, counsellors discuss a future where relationships between humans and non-humans (e.g., chatbots) may further materialise through technological developments. Although chatbots were viewed as not affording empathic care in the present, participants suggested this may be overcome in the future if chatbots were to become further ‘humanised’ – a vision that retains the centrality of human forms of empathy as measures of quality care.
Providing ‘more-than-human’ care: Chatbots and the distribution of care

Although many participants expressed the view that if chatbots replaced human counsellors, online care would lack empathy, most agreed that care provided by human and chatbot hybrid models could potentially afford more efficient care. Many participants expressed the view that chatbots could be used as a tool to perform certain tasks and supplement, rather than replace, human-centred care. For example, Joe (Male, 25, methamphetamine and diazepam) stated:

[A chatbot] has positives and negatives and if people go down the right path and use it for purposes more based on using it as a tool rather than taking the human aspect out, I think it would have a lot more of a positive input than using it to get rid of people.

A number of participants gave specific examples of how chatbots could be used as a tool in online care. For instance, John (Male, 61, alcohol) mentioned chatbots could be helpful “maybe as a referral thing” if they could assess a client’s needs and provide phone numbers or referrals to a service. Sarah (Female, 22, promethazine) also viewed chatbots as being able to provide referrals:

[…] like if [a chatbot] flagged the word ‘anxiety’, and it went: ‘Oh, here’s a link to a really good video on how to deal with anxiety.’ I think that would be good, if they had that kind of function.

Counsellors also identified a number of simple functions that chatbots could perform, highlighting their affordances for online counselling in terms of information provision and triaging:

Facilitator 1: Could you see any positive aspects or areas where that could be useful in terms of online sessions?
Counsellor 1: Maybe, only for information-giving — literally only for information-giving.
Counsellor 2: Yeah, [chatbots could provide] counselling online for somebody to jump on and just the first question is: ‘Do you have an alcohol or drug issue? Yes? Okay, stay on the chat!’ Because of [the name of the service being] Counselling Online so many people jump on and they don’t actually have an alcohol or drug issues […]
Facilitator 2: Sort of filtering?
Counsellor 2: Yeah.
quote indicates that chatbots (or human counsellors for that matter) may
people jump on and they don’t actually have an alcohol or drug issues
afford different possibilities for care in different situations — sometimes
[…]
“good”, sometimes “bad”, but different depending on how chatbots or
Facilitator 2: Sort of filtering?
counsellors came together within networks of interacting actors within
Counsellor 2: Yeah.
care encounters.
Elaborating on this concept, Phil (Male, 38, synthetic cannabis) dis- In another account, Lucas (Male, Age not available, alcohol) ex-
cussed how hybrid human and chatbot systems could effectively deliver plained that chatbots in online care may reduce human errors, thus af-
online interventions: fording a higher standard of care:
Phil: I actually think that as long as [chatbots are] well-setup and well- If you’ve got a chatbot talking to you which is highly intelligent and well-
qualified and well-trained, but also that there’s a good level of human programmed compared to a human – chatbot, I – for me, maybe there’s an
handoff, where humans can get involved very quickly and easily, I actu- advantage in the chatbot. Yeah, maybe, maybe because of human error
ally can see places where they would be useful in counselling. Things like [laughs] and the like, yeah.
maybe where you’re just – where literally it’s just asking for check-ins:
Spencer’s (Male, 24, cannabis) discussion of chatbots affording in-
‘Have you used [drugs] today, yesterday?’ that kind of thing…
creased expertise revealed a sense that individual humans may be lim-
‘Have you felt these things?’
ited in their knowledge and ability compared to a future chatbot. A
[…] This is the domain the chatbots are useful for. Then I can see how
chatbot could be programmed with a greater knowledge or evidence
they could at least do some of the legwork and do a little bit of the back-
base from which to work. In regards to chatbots, Spencer stated:
ground before it went to a human. I can only see them as being a tool used
by human counsellors rather than in any way taking over interactions. [Having chatbots delivering care] means the collective knowledge of say
Facilitator: It sounds like there’s a limit to how much they could provide? 50 counsellors could help one person as opposed to the full knowledge of
Phil: Yeah, but I could see how they could at least shoulder a little bit one.
of the workload of gathering basic information and sticking it into some
Here, Spencer highlights that a future affordance of chatbots is their
kind of metadata repository or database and an interaction log, doing a
potential to draw on multiple knowledges about alcohol and other drug
bit of that kind of initial data collection.
use and addiction to inform counselling practice. A chatbot’s ability to
These accounts point to various possibilities for human/chatbot hy- draw on multiple ontologies of addiction might provide people experi-
bridised systems to deliver more efficient online care. In the context encing alcohol and other drug concerns with different options for un-
Affordances and constraints for clients’ communication in interactions with chatbots

In addition to reflecting on the benefits of chatbots for online care, clients and counsellors also expressed concerns about how the use of chatbots may constrain counselling interactions. Some participants expressed the view that if they interacted with a chatbot, they would be less likely to be open and honest in comparison to speaking with a human counsellor. Rob (Male, 30, alcohol) suggested he would share “way less” if he was communicating with a chatbot. In another example, Rachel (Female, 22, methamphetamine) described how interacting with a chatbot might also limit the information she disclosed during therapy:

Facilitator: Yeah, okay. Let’s say, for example, in some futuristic world, I guess, that they could program a machine to actually talk, I guess like a human […] How would that affect the way you interact and share information?
Rachel: I wouldn’t. Yeah, I just wouldn’t. But, okay, let say if in the futuristic world, [chatbots could]. I certainly wouldn’t be as open, I certainly – I wouldn’t feel comfortable enough to be open, at all.

In another account, Nicole (Female, 30, alcohol) expressed the view that while she would not necessarily share less with a chatbot, the nature and content of the information shared would be different. This was because, for Nicole, chatbots lack the capacity for inferential thinking, meaning that any information she shared would have to be curated and simplified:

Facilitator: So would the way that you would potentially interact and share information with a chatbot – would that I guess be different from the way you’d interact and share information with a human counsellor?
Nicole: I think so. I think because I would – if I knew I was talking to a chatbot, I would have to make sure I included the details and the phrases that I wanted feedback on. Whereas, the inferential capacity of a human to go, ‘I think what you’re saying, even though you’re not saying it, is that you are really stressed.’ Or: ‘That you are really frustrated and angry.’ Rather than a computer going: ‘Oh it sounds like you are thirsty and that’s why you are drinking.’ So you need [laughs] – I think you would have to like – you would have to know the answers to your questions and just want somebody else to say them. Whereas, yeah, when you’re approaching a human, you don’t have to know and you don’t have to – you can be more honest […] Yeah, you don’t have to disclose everything either. Because a human can read between the lines.

Similarly, the ability of human counsellors to “read between the lines” was also mentioned by a counsellor in a focus group. In this account, counsellors reflected on sensitive issues raised in online counselling and how chatbots may not be able to address clients’ needs:

Counsellor 1: [Talking about issues including] grief or domestic violence, that’s something that as humans we can probably acknowledge and then possibly contain the conversation. But I’m not sure how a robot would do the same thing.
Counsellor 3: Yeah.
Counsellor 2: Yeah, [or be] able to read between the lines and those things.

While not the only element of care, Nicole’s and the counsellors’ accounts reiterate the importance of the affective dimension of care (see Dennis, 2019; Duncan et al., 2019). In these examples, this includes the ability of a human counsellor to understand clients’ concerns and, moreover, make them feel comfortable and safe to discuss sensitive topics. However, whilst the ability of human counsellors to “read between the lines” in online encounters is described as desirable, in certain situations this could also be interpreted as jumping to conclusions or making assumptions without asking people how they feel about or understand their experiences. Given that people with alcohol and other drug concerns tend to experience stigma in healthcare settings, inferring and making assumptions about their concerns could also reinforce stigma, or relegate a range of other possible explanations for alcohol and drug consumption to the background.

While some participants viewed chatbots as potentially having a negative influence on client interactions, others suggested that chatbots may have a positive influence in online counselling contexts. For example, Nicole (Female, 30, alcohol) remarked that she wouldn’t have to temper her behaviour in order to not offend a human:

Facilitator: Yep. Any positive aspects at all of a chatbot?
Nicole: You definitely don’t have to worry about offending anybody!

In another example, Tim (Male, 76, alcohol) noted that the privacy chatbots may afford could lead to higher rates of treatment engagement in certain cases:

Facilitator: Do you think there would be any positives, maybe not for yourself, but for other people about this more computerised response or anything?
Tim: I can imagine people wanting to try [speaking to chatbots] for reasons of extreme privacy and confidentiality.

These accounts illustrate that for different types of clients, with different care needs (e.g., those wanting human contact, or seeking privacy), the affordances of chatbots, and how these may influence client conduct as part of a human-computer interaction (Norman, 1988), varied. Rather than being a pre-programmed effect designed by humans, affordances were seen to emerge as a result of human and non-human encounters in online care spaces.

Discussion: Reflecting on ‘more-than-human’ futures and online care

When reflecting on the use of chatbots in online care, some participants drew on their everyday experiences of encountering chatbots, while others drew on dystopian, futuristic imaginaries informed by popular cultural depictions of machines and their effect on society. These occasional references to dystopian futures point to a fear of AI technologies supplanting humans among some, but not all, participants.

A strong theme in our analysis was the concern that the growing use of chatbots in online settings could lead to the replacement of human counsellors and undermine the kind of empathic care considered especially important for effective online counselling. Moreover, the ‘human element’ was viewed as an essential part of care that should be maintained. Viewed in light of recent critical work on ‘care’ (Puig de la Bellacasa, 2017; Dennis, 2019; Duncan et al., 2019; Farrugia et al., 2019; Mol, 2008), our participants’ concerns raise important ethico-political questions about the future of online care. Specifically, if the affective practices definitional to care, that is, care characterised by empathy and mutual understanding, are undermined or foreclosed by the use of chatbots in online counselling, exactly what type of ‘care’ materialises in these settings? If it is a perfunctory form of ‘service delivery’, is it the type of ‘care’ that alcohol and other drug digital interventions should afford? ‘Service delivery’ in this impoverished form risks entrenching, rather than alleviating, the problems encountered in face-to-face settings and thus undermining the goals of digital healthcare to reach a wider audience, and to overcome barriers such as stigma and low rates of treatment-seeking (Silva, Rodrigues, de la Torre Díez, López-Coronado, & Saleem, 2015).
A. Barnett, M. Savic and K. Pienaar et al. International Journal of Drug Policy 94 (2021) 102910

Beyond these fears about the application of chatbots in alcohol and other drug treatment futures, most participants were receptive to the possibility of humans and chatbots working in unison to perform simple data-gathering or repetitive tasks so that human intelligence and agency might be freed up to engage in more complex activities. For example, AI technologies could be used to collect basic demographic information, take client histories, and triage and refer clients to appropriate services, resources or health professionals. In this way, participants in this study expressed greater acceptance of hybridised care delivery, where the non-human, in the form of the chatbot, supported the human to provide care and services.
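To make the kind of discrete task participants described more concrete, the minimal sketch below (in Python) shows one way a rule-based chatbot could administer a brief alcohol screen and route clients to a next step. It is an illustration only, not a description of the Counselling Online service or of any deployed system: the AUDIT-C item wording is paraphrased from the widely used screen, and the function names, score thresholds and referral messages are our own assumptions.

```python
# Hypothetical sketch of a chatbot performing a discrete screening-and-triage
# task. All names, thresholds and messages are illustrative assumptions,
# not a deployed system.

AUDIT_C_ITEMS = [
    ("How often do you have a drink containing alcohol?",
     ["Never", "Monthly or less", "2-4 times a month",
      "2-3 times a week", "4 or more times a week"]),
    ("How many standard drinks do you have on a typical day when you are drinking?",
     ["1 or 2", "3 or 4", "5 or 6", "7 to 9", "10 or more"]),
    ("How often do you have six or more drinks on one occasion?",
     ["Never", "Less than monthly", "Monthly", "Weekly", "Daily or almost daily"]),
]


def ask(question, options):
    """Present one item and return the 0-4 score of the selected option."""
    print(question)
    for score, option in enumerate(options):
        print(f"  [{score}] {option}")
    while True:
        reply = input("> ").strip()
        if reply.isdigit() and int(reply) < len(options):
            return int(reply)
        print("Please enter one of the listed numbers.")


def screen():
    """Administer the three items and return a total score between 0 and 12."""
    return sum(ask(question, options) for question, options in AUDIT_C_ITEMS)


def triage(score):
    """Map a screening score to a next step (thresholds are illustrative)."""
    if score >= 8:
        return "Thanks. I'll connect you with a counsellor now."
    if score >= 4:
        return "Thanks. We recommend booking a session with a counsellor."
    return "Thanks. Here are some self-help resources you might find useful."


if __name__ == "__main__":
    print("Hi, I'm a chatbot. I'll ask three quick questions before your session.")
    print(triage(screen()))
```

Even in this toy form, the division of labour matches participants’ accounts: the chatbot handles structured, repetitive data-gathering and simple routing, while anything more complex is handed to a human counsellor.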
Some participants raised concerns that interacting with a chatbot may constrain open and honest communication between the client and the chatbot. Others suggested a number of practical benefits of interacting with a chatbot, such as increased assurance of clients’ privacy and confidentiality. Situating our work within scholarship on technological affordances (Gibson, 1979; Hutchby, 2001; Latour & Venn, 2002; Norman, 1988), our analysis suggests that chatbots are perceived as ‘furnishing’ (Gibson, 1979) clients and counsellors with different opportunities in different circumstances. Thus, the affordances that chatbots may offer clients who access online care are not fixed, stable or in-built features of the technologies themselves. Rather, depending on the context, desires and actions of the humans interacting with chatbots, the human-computer interaction has the potential to emerge in multiple ways (Latour & Venn, 2002; Norman, 1988).

Recognising the complexities of delivering online care evidenced in our participants’ accounts, we have proposed a ‘more-than-human’ model of care, attuned both to the importance of traditional (human) modes of care and to the affordances of AI-driven technologies. We suggest such a model has the potential to disrupt outmoded, anthropocentric framings of care that rely overly on individual treatment providers, to the exclusion of forms of care that emerge at the intersection of human counsellors and non-human technologies. Embracing a ‘more-than-human’ approach opens the way for care provision to be distributed among human and non-human actors, recognising the continuing need for traditional counselling while also maximising the affordances and agency of technological actors (e.g., AI-driven chatbots). The distribution of care across the more-than-human spectrum has the potential to support counsellors to provide high-quality care to more people in need, an issue of particular salience for alcohol and other drug counselling in Australia, where unmet demand persists (Ritter, Chalmers, & Gomez, 2018).

Beyond viewing chatbots as a technological solution, it is vital that policymakers formulating and implementing future technological change consider the social effects of digital health technologies. Addressing users’ perceptions of the benefits and limits of digital health interventions such as chatbots and apps is essential to inform the design and deployment of new technologies. When designing digital health interventions, focusing only on quantitative outcome measures, such as whether an intervention increases rates of recovery or reduces rates of relapse, precludes consideration of the ways clients and counsellors interact with new technologies and of the effects these interactions generate. For example, in our own work, we see how clients and counsellors have concerns that care delivered online by chatbots may lack the empathy of a traditional therapeutic relationship, thus potentially limiting the quality of care delivered. A broadening of evaluation and ‘evidencing’ methods to take into account the social implications of novel digital health interventions, as they emerge in local implementation situations, is vital to inform future digital health development (Murray et al., 2016; Rhodes & Lancaster, 2019; Savic et al., 2018). Moreover, rigorous, critical research is needed into the social and political dimensions of ‘more-than-human’ alcohol and other drug interventions, to minimise any damaging or counterproductive effects and to maximise the potential benefits of these new modes of care.

Funding

AC receives an NHMRC Career Development Fellowship (No. APP1123311). This project was funded through the Monash University Faculty of Arts and Faculty of Medicine, Nursing and Health Sciences Interdisciplinary Research Scheme.

Declarations of Interest

None.

Acknowledgements

We express our thanks to the clients and counsellors who gave their time to participate in this study, and to the anonymous reviewers and editor(s) for their constructive feedback on our work. We also acknowledge the support of managers from the Counselling Online service, especially Rick Loos, Orson Rapose, and Darryl Jones.

References

Barnett, A., Dilkes-Frayne, E., Savic, M., & Carter, A. (2018). When the brain leaves the scanner and enters the clinic: The role of neuroscientific discourses in producing the problem of “addiction”. Contemporary Drug Problems, 45(3), 227–243.
Barnett, A., Hall, W., Fry, C. L., Dilkes-Frayne, E., & Carter, A. (2018). Implications of treatment providers’ varying conceptions of the disease model of addiction: A response. Drug and Alcohol Review, 37(6), 729.
Bibault, J.-E., Chaix, B., Nectoux, P., & Brouard, B. (2019). Healthcare ex machina: Are conversational agents ready for prime time in oncology? Clinical and Translational Radiation Oncology, 16, 55–59.
Boyd, J., Lavalley, J., Czechaczek, S., Mayer, S., Kerr, T., Maher, L., & McNeil, R. (2020). “Bed bugs and beyond”: An ethnographic analysis of North America’s first women-only supervised drug consumption site. International Journal of Drug Policy, 78, Article 102733.
Brahnam, S., & De Angeli, A. (2012). Gender affordances of conversational agents. Interacting with Computers, 24(3), 139–153.
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.
Budney, A. J., Borodovsky, J. T., Marsch, L. A., & Lord, S. E. (2019). Technological innovations in addiction treatment. In I. Danovitch & L. Mooney (Eds.), The assessment and treatment of addiction (pp. 75–90). Los Angeles, CA: Elsevier.
Crutzen, R., Peters, G.-J. Y., Portugal, S. D., Fisser, E. M., & Grolleman, J. J. (2011). An artificially intelligent chat agent that answers adolescents’ questions related to sex, drugs, and alcohol: An exploratory study. Journal of Adolescent Health, 48(5), 514–519.
Denecke, K., Tschanz, M., Dorner, T. L., & May, R. (2019). Intelligent conversational agents in healthcare: Hype or hope? Studies in Health Technology and Informatics, 259, 77–84.
Dennis, F. (2017). The injecting ‘event’: Harm reduction beyond the human. Critical Public Health, 27(3), 337–349.
Dennis, F. (2019). Injecting bodies in more-than-human worlds: Mediating drug-body-world relations. New York, NY: Routledge.
Dilkes-Frayne, E., Savic, M., Carter, A., Kokanović, R., & Lubman, D. I. (2019). Going online: The affordances of online counseling for families affected by alcohol and other drug issues. Qualitative Health Research, 29(14), 2010–2022.
Duncan, T., Sebar, B., Lee, J., & Duff, C. (2019). Mapping the spatial and affective composition of care in a drug consumption room in Germany. Social & Cultural Geography, 1–20.
Farrugia, A., Neale, J., Dwyer, R., Fomiatti, R., Fraser, S., Strang, J., & Dietze, P. (2019). Conflict and communication: Managing the multiple affordances of take-home naloxone administration events in Australia. Addiction Research & Theory, 1–9.
Foley, T., & Woollard, J. (2019). The digital future of mental healthcare and its workforce: A report on a mental health stakeholder engagement to inform the Topol Review. Retrieved from https://topol.hee.nhs.uk/wp-content/uploads/HEE-Topol-Review-Mental-health-paper.pdf
Fraser, S., & Moore, D. (2011). Governing through problems: The formulation of policy on amphetamine-type stimulants (ATS) in Australia. International Journal of Drug Policy, 22(6), 498–506. doi:10.1016/j.drugpo.2011.09.004
Fraser, S., Treloar, C., Gendera, S., & Rance, J. (2017). ‘Affording’ new approaches to couples who inject drugs: A novel fitpack design for hepatitis C prevention. International Journal of Drug Policy, 50, 19–35.
Gibson, J. (1979). The ecological approach to visual perception. Boston, MA: Houghton Mifflin.
Gratzer, D., & Goldbloom, D. (2020). Therapy and e-therapy: Preparing future psychiatrists in the era of apps and chatbots. Academic Psychiatry, 1–4.
Groves, C. (2017). Emptying the future: On the environmental politics of anticipation. Futures, 92, 29–38.
Haraway, D. (2006). A cyborg manifesto: Science, technology, and socialist-feminism in the late 20th century. In The international handbook of virtual learning environments (pp. 117–158). Dordrecht: Springer.
Home Office. (2013). FRANK: Free practical drug advice for adults and children. Retrieved from https://www.gov.uk/government/publications/frank
Hutchby, I. (2001). Technologies, texts and affordances. Sociology, 35(2), 441–456.
Lancaster, K., Seear, K., Treloar, C., & Ritter, A. (2017). The productive techniques and constitutive effects of ‘evidence-based policy’ and ‘consumer participation’ discourses in health policy processes. Social Science & Medicine, 176, 60–68.
Laranjo, L., Dunn, A. G., Tong, H. L., Kocaballi, A. B., Chen, J., Bashir, R., & Lau, A. Y. (2018). Conversational agents in healthcare: A systematic review. Journal of the American Medical Informatics Association, 25(9), 1248–1258.
Latour, B., & Venn, C. (2002). Morality and technology. Theory, Culture & Society, 19(5–6), 247–260.
Laumer, S., Maier, C., & Gubler, F. T. (2019). Chatbot acceptance in healthcare: Explaining user adoption of conversational agents for disease diagnosis. Paper presented at the 27th European Conference on Information Systems (ECIS).
Martínez-Miranda, J. (2017). Embodied conversational agents for the detection and prevention of suicidal behaviour: Current applications and open challenges. Journal of Medical Systems, 41(9), 135.
Martin, A., Myers, N., & Viseu, A. (2015). The politics of care in technoscience. Social Studies of Science, 45(5), 625–641.
McCorduck, P. (2004). Machines who think: A personal inquiry into the history and prospects of artificial intelligence. Boca Raton, FL: CRC Press.
Mol, A. (2008). The logic of care: Health and the problem of patient choice. Abingdon, Oxon: Routledge.
Morency, L.-P., Stratou, G., DeVault, D., Hartholt, A., Lhommet, M., Lucas, G., & Gratch, J. (2015). SimSensei demonstration: A perceptive virtual human interviewer for healthcare applications. Paper presented at the Twenty-Ninth AAAI Conference on Artificial Intelligence.
Müller, V. C., & Bostrom, N. (2016). Future progress in artificial intelligence: A survey of expert opinion. In Fundamental issues of artificial intelligence (pp. 555–572). Switzerland: Springer.
Murphy, M. (2015). Unsettling care: Troubling transnational itineraries of care in feminist health. Social Studies of Science, 45(5), 717–737.
Murray, E., Hekler, E. B., Andersson, G., Collins, L. M., Doherty, A., Hollis, C., … Wyatt, J. C. (2016). Evaluating digital health interventions: Key questions and approaches. American Journal of Preventive Medicine, 51(5), 843–851.
Nadarzynski, T., Miles, O., Cowie, A., & Ridge, D. (2019). Acceptability of artificial intelligence (AI)-led chatbot services in healthcare: A mixed-methods study. Digital Health, 5(1), 1–12.
Nguyen, M. (2020). How artificial intelligence and machine learning produced robots we can talk to. Retrieved from https://www.businessinsider.com/chatbots-talking-ai-robot-chat-machine?r=AU&IR=T
Norman, D. A. (1988). The psychology of everyday things. New York, NY: Basic Books.
Pienaar, K., & Savic, M. (2016). Producing alcohol and other drugs as a policy ‘problem’: A critical analysis of South Africa’s ‘National Drug Master Plan’ (2013–2017). International Journal of Drug Policy, 30, 35–42.
Puig de la Bellacasa, M. (2017). Matters of care: Speculative ethics in more than human worlds. Minneapolis, MN: University of Minnesota Press.
Rhodes, T., & Lancaster, K. (2019). Evidence-making interventions in health: A conceptual framing. Social Science & Medicine, 238, Article 112488.
Rizzo, A., Shilling, R., Forbell, E., Scherer, S., Gratch, J., & Morency, L.-P. (2016). Autonomous virtual human agents for healthcare information support and clinical interviewing. In Artificial intelligence in behavioral and mental health care (pp. 53–79). Cambridge, MA: Academic Press.
Savic, M., & Lubman, D. I. (2018). An argument against the implementation of an “overarching universal addiction model” in alcohol and other drug treatment. Drug and Alcohol Review, 37(6), 721–722.
Savic, M., Ferguson, N., Manning, V., Bathish, R., & Lubman, D. I. (2017). “What constitutes a ‘problem’?” Producing ‘alcohol problems’ through online counselling encounters. International Journal of Drug Policy, 46, 79–89.
Savic, M., Dilkes-Frayne, E., Carter, A., Kokanovic, R., Manning, V., Rodda, S. N., & Lubman, D. I. (2018). Making multiple ‘online counsellings’ through policy and practice: An evidence-making intervention approach. International Journal of Drug Policy, 53, 73–82.
Silva, B. M., Rodrigues, J. J., de la Torre Díez, I., López-Coronado, M., & Saleem, K. (2015). Mobile-health: A review of current state in 2015. Journal of Biomedical Informatics, 56, 265–272.
World Health Organisation. (2020). WHO launches a chatbot on Facebook Messenger to combat COVID-19 misinformation. Retrieved from https://www.who.int/news-room/feature-stories/detail/who-launches-a-chatbot-powered-facebook-messenger-to-combat-covid-19-misinformation