Human-Data Interaction
Introduction
We have moved from a world where computing is siloed and specialised, to
a world where computing is ubiquitous and everyday. In many, if not most,
parts of the world, networked computing is now mundane as both
foreground (e.g., smartphones, tablets) and background (e.g., road traffic
management, financial systems) technologies. This has permitted, and
continues to permit, new gloss on existing interactions (e.g., online
banking) as well as distinctively new interactions (e.g., massively scalable
distributed real-time mobile gaming). An effect of this increasing
pervasiveness of networked computation in our environments and our lives
is that data are also now ubiquitous: in many places, much of society is
rapidly becoming “data driven”.
Many of the devices we use, the networks through which they connect – not
just the Internet but also alternative technologies such as fixed and cellular
telephone networks – and the interactions we experience with these
technologies (e.g., use of credit cards, driving on public highways, online
shopping) generate considerable trails of data. These data are created both
consciously by us – whether volunteered via, e.g., our Online Social
Network (OSN) profiles, or observed as with our online shopping behaviour
(World Economic Forum 2011) – and they are inferred and created about us
by others – not just other people but, increasingly, machines and
algorithms, too.
Author/Copyright holder: tara hunt. Copyright terms and licence: CC BY-SA 2.0
We create data trails both consciously and unconsciously. Most of us are very
self-conscious about what we post on our Facebook and Twitter accounts –
but only a minority of people are aware that they leave a very detailed data trail
in many other ways, e.g., when browsing online or walking in their cities
carrying their smartphones. Increasingly, machines and algorithms are tracking
every step we take – both online and offline – and we are very rarely “warned”
or notified about this surveillance.
We think it is crucial to understand how our behaviours, the data those
behaviours generate, and the algorithms which process these data
increasingly shape our lives. Human-Data Interaction (HDI) places the human
at the centre of these data flows, and HDI provides mechanisms which can help
the individual and groups of people to interact explicitly with these systems and
data.
In this article we will next go into more detail as to why HDI deserves to be
named as a distinct problematic (§2) before defining just what it is we might
mean by HDI (§3). We will then give our story of the development of HDI to
its state by the mid-2010s, starting with Dataware, an early technical
attempt to enable HDI (§4). We follow this with a deeper discussion of what
exactly the “I” in HDI might mean – how interaction is to be construed and
constructed in HDI – and a recent second attempt at starting to define a
technical platform to support HDI with that understanding in mind (§5 and
§6 respectively). We conclude with a brief discussion of some exciting areas
of work occurring in the second half of the 2010s that we identify (§7),
though there are no doubt many more! Finally, after summarising (§8), we
give a few indications of where to go to learn more (§9).
Many Internet businesses rely on extensive, rich data collected about their
users, whether to target advertising effectively or as a product for sale to
other parties. The powerful network externalities that exist in rich data
collected about a large set of users make it difficult for truly competitive
markets to form. We can see a concrete example in the increasing range and
reach of the information collected about us by third-party websites, a space
dominated by a handful of players, including Google, Yahoo, Rubicon
Project, Facebook and Microsoft (Falahrastegar et al. 2014, 2016). This
dominance has a detrimental effect on the wider ecosystem: online
service vendors find themselves at the whim of large platform and
Application Programming Interface (API) providers, hampering
innovation and distorting markets.
Having discussed why HDI is a topic that should concern us, we now turn to
a more detailed discussion of just what it is that we might mean when we
use the term HDI.
In many ways, it is the interplay between the last three definitions that we
believe gives rise to the need for a broader conception of HDI: when the
data trails of individuals’ private behaviour are coalesced and analysed as
big data; and where the results of that analysis, whether or not correct, are
fed back into the data associated with an individual. Data, particularly
personal data, can be seen as a boundary object (Star and Griesemer 1989;
Star 2010), reflected in the many ways different communities refer to and
think of data. For example, in contrast with big data we see data trails
referred to as small data (Estrin 2013) where “N = me”, pertaining to each of
us as individuals. We see yet other terms used in other fields: participatory
data (Shilton 2012) in health, microdata (Kum et al. 2014) in population
informatics, and digital footprint (Madden et al. 2007) in the digital
economy.
At the heart of HDI lie three core principles: legibility, agency and
negotiability, set out by Mortier et al. (2014):
Data created about us are often less well understood by us. For instance,
third-party website tracking, when combined with recommender systems
and data-mining algorithms, can create new data from inferences, such as
advertising preferences (Vallina-Rodriguez et al. 2012). Credit-scoring
companies and “customer science” companies collect and mine shopping
and transaction data to both predict and enable behaviours. Not all such
data uses are strictly commercial, however. For instance, personal data can
be used to generate data for new crowdsourced applications such as traffic
reports or optimised bus routes (Berlingerio et al. 2013). But new tools for
informing people about their data, and the practices used around these
data, are essential.
Data created by us arise from our interaction with numerous sensors and
technologies, from what are now mundane technologies such as OSNs and
websites. The richness and variety of such data, however, is continually
increasing, particularly with the growing interest in lifelogging and the
“Quantified Self” (Choe et al. 2014). Examples include devices and sensors
with which we explicitly interact when monitoring our health, e.g., continuous
blood glucose monitors, smart asthma inhalers, bathroom scales that
track our weight, or smartphone apps that monitor our sleep patterns.
Such devices can create “people-centric” sensor trails (Campbell et al.
2008). Related advances in portable medical sensors, affordable personal
genomics screening, and other tools for mobile health diagnosis will
generate new personal medical datasets (Kumar et al. 2013).
Legibility entails several features. First, we need to become aware that data
are being collected; this is relatively straightforward to achieve, as with
recent European legislation requiring that websites make clear to users when
the site deposits browser cookies. The second, more complex, requirement is
that we become aware of the data themselves and their implications. A
data-centric view of the world requires that we pay attention to the
correctness (in an objective knowledge sense) of data. In contrast, a human-
centric view requires that systems allow for different but equally valid
viewpoints of data. Similarly, interpretations of data may vary significantly
over time, hence (for example) the recent Court of Justice of the European
Union (2014) “right-to-be-forgotten” where public data about individuals
can be removed from search engine results so that the distant past is not
kept fresh in people’s minds, mirroring in some ways the natural human
behaviour of forgetting once topical information.
This is more than simply the ability to provide informed consent, though
even that is often not achieved (or was as of the mid-2010s) (Ioannidis
2013; Luger, Moran, and Rodden 2013; Luger and Rodden 2013). The data
collection process may have inherent biases due to contextual
dependencies, temporal and other sampling biases, and simply
misunderstood semantics. Inferences drawn from our personal data could
be wrong, whether due to flawed algorithms, incomplete data or the way our
attitudes and preferences change over time. User-centric controls are
required, not only for consent but for the revocation of collected personal
data (Whitley 2009).
In addition to a richer and more robust dialogue between regulators and the
industry, we believe that enabling these requires stakeholders, including
researchers, regulators, technologists, and industry, to establish qualitative
and quantitative techniques for understanding and informing activity
around human data. A survey of 1,464 UK consumers found that 94% believed
that they should be able to control information collected about them
(Bartlett 2012). It is worth noting that providing such abilities might also
benefit data collection and processing organisations: the
same survey reported that 65% of respondents said that they would share
additional data with organisations “if they were open and clear about how
the data would be used and if I could give or withdraw permission”.
Note that we do not suggest all users must become continuously engaged in
the collection, management and processing of their personal data.
Extensive work in the context of privacy and personal data has
demonstrated such features as the privacy paradox, whereby privacy only
becomes a concern after a violation (Barnes 2006), and we might reasonably
anticipate that many people will not often need or desire the capacity to act
within these data-collection and -processing systems. However, many will
from time to time, and some enthusiasts may do so more frequently. We
claim that they must be supported in doing so.
It is also worth noting that not all activities associated with processing of
personal data are harmful, and so granting users agency in these systems
is not solely a defensive measure. Recommender systems (Ricci et al.
2010) can provide a useful function, saving us time and effort. Live traffic
updates through services such as Google Maps assist us in avoiding traffic
jams. Public health initiatives are often based on the aggregation of large
quantities of highly personal data. The opportunity for data subjects to
engage with data systems may enable them to correct and improve the data
held and the inferences drawn, improving the overall quality and utility of
the applications using our personal data.
Much debate around the use of personal data has assumed that data are
considered a “good” that can be traded and from which economic value
should be extracted (Organisation for Economic Co-operation and
Development 2013). Although we agree that it may well be possible to
enable an ecosystem using economic value models for utilisation of
personal data and marketplaces (Aperjis and Huberman 2012), we believe
that power in the system is—as of 2016—disproportionately in favour of the
data aggregators that act as brokers and mediators for users, causing the
apparent downward trajectory of economic value in the information age
(Lanier 2013).
Some of these issues are already being faced by researchers carrying out
experiments that use personal data. Experiment design requires careful
consideration of the types of data to be used and the ways in which
appropriate consent to use data can be obtained (Brown, Brown, and Korff
2010). Sharing of research data is becoming popular, and even mandated, as
a mechanism for ensuring good science and the dissemination of good
science (Callaghan et al. 2012). As a result, issues such as the privacy and
ethics issues of sharing – and not sharing (Huberman 2012) – data are
increasingly being discussed (O’Rourke et al. 2006).
Having discussed just what we might mean by HDI, we now turn the clock
back to an early exploration of technical matters that informed the
development of HDI. This provides a basis for the direction in which HDI
has moved and for its current trajectory.
Dataware: HDI v0
The Dataware model of McAuley, Mortier, and Goulding (2011) was a very
early foray into providing a particular instantiation of what later became
core HDI concepts. The model is based on three fundamental types of
interacting entity, depicted in Figure 2: the owner (or user or subject), by or
about whom data is created; the data sources, which generate and collate
data; and the data processors, which wish to make use of the user’s data in
some way.
Author/Copyright holder: Richard Mortier. Copyright terms and licence: CC BY-
NC-ND
Figure 2: Actors within the Dataware model: owner (or user or subject), sources,
and processors, interaction among whom is mediated through the owner’s
personal container.
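The mediation depicted in Figure 2 can be sketched in code. The following Python fragment is our illustration only – the class and method names are invented and do not reflect the actual Dataware implementation – but it captures the "move code to data" idea: a processor's function runs inside the owner's personal container, and only over sources the owner has approved.

```python
from dataclasses import dataclass, field

@dataclass
class DataSource:
    """A source that generates and collates data about the owner."""
    name: str
    records: list

@dataclass
class PersonalContainer:
    """Mediates all interaction between data sources and data processors."""
    owner: str
    sources: dict = field(default_factory=dict)
    grants: set = field(default_factory=set)  # (processor, source) pairs approved

    def register(self, source):
        self.sources[source.name] = source

    def grant(self, processor, source_name):
        self.grants.add((processor, source_name))

    def process(self, processor, source_name, fn):
        """Run a processor's code against a source, only if the owner granted
        access. The raw records never leave the container."""
        if (processor, source_name) not in self.grants:
            raise PermissionError(f"{processor} has no grant for {source_name}")
        return fn(self.sources[source_name].records)

container = PersonalContainer(owner="alice")
container.register(DataSource("energy", [1.2, 0.9, 1.4]))
container.grant("tariff-advisor", "energy")
total = container.process("tariff-advisor", "energy", sum)
```

Note that the processor receives only the result of its computation (here, a total), not the underlying records – the key distinction from simply releasing data to third parties.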
Take, for example, a young child’s personal data – who owns it and who
controls it? It cannot be assumed that the same person exercises ownership
and control. Ownership may well reside with the person to whom the data
applies, as it were. However, control in such a situation may well be
delegated to another (e.g., a parent), thereby reflecting the organised
practices of personal data handling (take, for example, a young child’s
health records or bank details). The same does not apply to teenagers,
however. As they develop their independence we might well expect, again
in line with current organised practices of human data interaction, that
they will assume control over their own data along with a great many other
aspects of their lives. Even so, this may be a phased rather than a sharp
transition. The same may apply, in reverse, to an elderly member of the
cohort who wishes to hand over the running of her affairs to someone
else. Situated within a lively social context, and accompanied by differing
relational rights and obligations, ownership and control cannot be
permanently fixed and tied to an individual, as the Dataware model
presumes. Instead, it will change over time with respect to a host of
evolving relationships and contingencies.
The same applies to the field of work itself. Schmidt points out that the
distributed activities of a cooperative work arrangement are articulated
with respect to objects within the field of work itself (e.g., data sources
within the catalogue). A key issue here revolves around the ‘conceptual
structures and resources’ that order the field of work, enabling members of
a cooperative ensemble to make sense of it and act upon it. Again the
question of interactional adequacy arises when we ask what conceptual
structures HDI provides. It is not that it provides none, but the terms
on which it does so are problematic from an interactional perspective.
Take, for example, the Dataware catalogue. It is conceptually ordered in
terms of ‘tables’ that render data sources intelligible in terms of accounts,
applications, installs, and services, etc. The problem in this is that the
conceptual structure of HDI as instantiated in Dataware is rendered in
terms of the underlying technology, rather than in terms of what is being
done through that technology, such as the processing of biological data as
part of a healthcare regime. The problem thus involves ordering the field of
work such that it reflects the work-being-done, or the work-to-be-done,
rather than the underlying technical components of that work. It is hard to
see, then, how users can articulate their distributed activities with respect
to objects in the field of work when those objects (data sources) lack
legibility or intelligibility to the broader populace in contrast to computer
scientists and software engineers. Other, more ‘user friendly’ – and more
pointedly, data-relevant and service-specific – conceptual structures and
resources are required.
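To make the contrast concrete, the following fragment juxtaposes an invented technology-centric catalogue record with an invented work-centric description of the same data source; both records are purely illustrative and neither is drawn from the Dataware catalogue itself.

```python
# How Dataware's catalogue orders a source: in terms of the underlying
# technology (accounts, installs, services). Values are invented.
tech_centric = {
    "table": "sources",
    "account": "urn:uuid:1234",
    "install": "sensor-driver-0.4",
    "service": "https://api.example/v1/stream",
}

# How the same source might be ordered in terms of the work being done
# through it, in words a patient might actually recognise.
work_centric = {
    "what": "blood glucose readings",
    "part_of": "my diabetes care plan",
    "shared_with": "my GP practice",
    "how_often": "every 5 minutes",
}
```

The first record is legible to software engineers; the second is legible to the person whose healthcare regime it describes.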
The issue of how users might drive the discovery process (finding data
processors for themselves, whether for personal, financial or social
purposes) is more problematic. We will soon discuss early thoughts on how
this might be addressed (§6), which turn upon making discovery of data
processors much like discovering new apps in app stores. Users are familiar
with and make a conscious choice to visit app stores, where they are
provided with rich metadata about apps and app authors that shapes their
decision-making. Data processors could be ‘vetted’, much like apps in the
iTunes Store, and progressively more detailed information about processing
could be provided, much like app permissions in the Google Play Store.
In addition, the social aspects of app stores also play an important role in
the discovery process: user ratings and social networking links help
build the trust between users and service providers that is essential in the
discovery and adoption of new technologies.
As part of this, it is key that users are not only able to visualise and inspect
the data held by a source, but that they can also visualise and thus
understand just what a data processor wants to take from a source or
collection of sources and why – that just what is being ‘shared’ is
transparently accountable to users, which may also involve making external
data sources (e.g., consumer trends data) visible so that users understand
just what is being handed over. Coupled to this is the need to enable
recipient design by users. There are two distinct aspects to this. One
revolves around enabling users to edit data, redacting aspects of the data
they do not wish to make available to others both within a cohort and
outside of it. The other revolves around controlling the presentation of data
to processors when the accuracy of data needs to be guaranteed (e.g.,
energy consumption readings).
In summary, the challenges of articulating personal data within HDI are not
settled matters. Rather, they open a number of thematic areas for further
investigation, elaboration and support:
Personal data discovery, including meta-data publication, consumer
analytics, discoverability[53] policies, identity mechanisms, and app
store models supporting discovery of data processers.
Personal data ownership and control, including group management of
data sources, negotiation, delegation and transparency/awareness
mechanisms, and rights management.
Personal data legibility, including visualisation of what processors would
take from data sources and visualisations that help users make sense of
data usage, and recipient design to support data editing and data
presentation.
Personal data tracking, including real time articulation of data sharing
processes (e.g., current status reports and aggregated outputs), and data
tracking (e.g., subsequent consumer processing or data transfer).
Databox: HDI v1
Dataware focused on a computational model for processing of personal data
– by moving code to data, the problems associated with release of data to
third parties could be avoided. However, it failed to consider in any detail
the numerous interactional challenges identified through consideration of
the HCI literature and the concepts of boundary object and articulation
work, discussed in the preceding section. Informed by that consideration,
our current work related to HDI is concerned with development of
infrastructure technology to provide for HDI in supporting individuals (in
the first instance) in management of their personal data. This effort refines
the initial concept of a cloud-hosted, online Personal Container into a
Databox (Haddadi et al. 2015). Your Databox is a physical device, supported
by associated services, that enables you to coordinate the collection of your
personal data, and to selectively and transiently make those data available
for specific purposes. Different models are supported that will enable you to
match your data to such purposes, from registration with privacy-
preserving data discovery services so that data processors can find your
Databox and request from you access to data it holds, to app stores in which
you can search for data processing applications that you wish to provide
with access to your data via your Databox. Its physicality offers a range of
affordances that purely virtual approaches cannot, such as located, physical
interactions based on its position and the user’s proximity.
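The notion of selective and transient access might be sketched as follows; the class and method names here are our invention, not the Databox API, but they illustrate the idea that data are released only while an unexpired, owner-issued grant exists.

```python
import time

class TransientGrant:
    """An owner-issued permission that expires after ttl_seconds."""
    def __init__(self, processor, dataset, ttl_seconds):
        self.processor = processor
        self.dataset = dataset
        self.expires = time.monotonic() + ttl_seconds

    def valid(self):
        return time.monotonic() < self.expires

class Databox:
    def __init__(self):
        self.data = {}
        self.grants = []

    def store(self, dataset, values):
        self.data[dataset] = values

    def grant(self, processor, dataset, ttl_seconds):
        self.grants.append(TransientGrant(processor, dataset, ttl_seconds))

    def query(self, processor, dataset):
        """Release data only while a matching, unexpired grant exists."""
        for g in self.grants:
            if g.processor == processor and g.dataset == dataset and g.valid():
                return self.data[dataset]
        raise PermissionError("no valid grant")

box = Databox()
box.store("steps", [4200, 5100, 3900])
box.grant("fitness-app", "steps", ttl_seconds=60)
steps = box.query("fitness-app", "steps")
```

Once the grant lapses, further queries fail: access is a revocable, time-bounded state rather than a one-off transfer of the data.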
41.7.1 Accountability
The potential efficacy of HDI fundamentally turns upon opening the
Internet up, as it were, and making it accountable to users. What we mean by
this is that at the network layer, the Internet only really supports
accounting to the extent required for settlement between Internet Service
Providers (ISPs), such as counting the number of bytes exchanged over
particular network interfaces to enable usage-based billing. With the kinds
of intimate data the IoT is envisioned to make available, this low-level “bits
and bytes” accounting will be completely inadequate. It will be necessary to
surface what data devices generate, how that data is recorded and
processed, by whom, where it flows to, etc. This metadata must be made
visible to users to enable legibility, agency and negotiability without
infringing users’ privacy.
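The kind of per-flow metadata this implies might look something like the following sketch; the schema and the example entries are entirely our invention, intended only to show how such records could support user-facing queries.

```python
import datetime

audit_log = []

def record_flow(device, data_kind, recipient, purpose):
    """Record one data flow: what a device sent, to whom, and why."""
    audit_log.append({
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "device": device,
        "data": data_kind,
        "recipient": recipient,
        "purpose": purpose,
    })

def flows_to(recipient):
    """Legibility: let a user ask 'what has been sent where, and why?'"""
    return [e for e in audit_log if e["recipient"] == recipient]

# Invented example flows from hypothetical household devices.
record_flow("thermostat", "temperature readings", "energy-supplier.example", "billing")
record_flow("smart-tv", "viewing history", "adnetwork.example", "ad targeting")
```

Such metadata describes flows without duplicating their contents, which is one way the visibility the text calls for could be provided without itself infringing privacy.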
One possible approach might be to provide smarter home hubs that support
a range of interfaces and control points developed for specific purposes.
Another is to support users in building their own infrastructure to a far
greater extent than is possible today. Instead of relying on others (e.g.,
ISPs) to provide, configure and manage infrastructure to support users, we
might seek to make it straightforward for users to create their own
infrastructure services, configuring and managing facilities such as
firewalling, virtual private networks, DNS and other services.
41.7.3 Resilience
Resilience is a key ingredient in the mix between the Internet, personal
infrastructures, and IoT applications in critical domains, such as health and
well-being or smart-device energy management. In short, we might ask
what happens to such applications when the Internet goes down (e.g., when
the local access router dies or there is a problem at the local exchange).
There is a critical need to build resilience into IoT infrastructures if we are
to rely upon applications in critical domains.
One possible solution is to build IoT infrastructure into the local physical
environment – e.g., into the fabric of the home – to provide the
necessary fallback. This might be complemented by formal modelling
techniques to enable the “in house” management of complex networked
systems of “dumb” devices. That, in turn, raises the challenge of how users
are to understand such techniques and interact with them to ensure quality
of service and the ongoing protection of privacy in the face of contingency.
41.7.4 Identity
As Peter Steiner put it in a cartoon in The New Yorker (1993), “On the
Internet, nobody knows you’re a dog”. Identity touches all aspects of HDI
and requires that meaningful statements can be made about just who has
access to a user’s data. The Internet, being concerned with moving packets
between network interfaces, provides no inherent support for higher-level
expressions of identity. Application layer means of supporting identity do
exist – e.g., TLS client certificates and PGP public keys – but they are
very complex to manage. Specific challenges here include how to ensure the
availability of the necessary “secrets” (keys, certificates) on all devices that
may be used to access relevant data; how to support the management of
data corresponding to multiple identities held by a user; and how to handle
the revocation of access.
Author/Copyright holder: Peter Steiner. Copyright terms and licence: Fair Use.
"On the Internet, nobody knows you're a dog" is an adage which began as a
cartoon caption by Peter Steiner and published by The New Yorker on July 5,
1993.
41.7.5 Dynamics
Devices generating data change context as they are shared between
individuals, and individuals change context as they move around in space
and time. Applications and services will come and go as well. Enabling users
to be aware of and to manage the dynamics of ongoing data processing –
who or what has access to which data, for which purposes, etc. – is a critical
challenge to the sustained harvesting of personal data. That ongoing data
harvesting will be dynamic and will potentially implicate multiple parties
(users and data consumers) also raises the challenge of understanding the
dialogues that are needed to sustain it; particularly the “work” these
dialogues need to support and how they should be framed, implemented
and maintained.
41.7.6 Collaboration
Systems developed to support personal data management typically focus on
the individual. But personal data rarely concerns just a single person. It is
far more common for sources of personal data to conflate information about
multiple individuals, who may have different views as to how personal it is.
For example, smart metering data gives a household’s energy consumption
in aggregate, and different household members may want that data to be
shared with data consumers at different levels of granularity. Supporting
the collaborative management and use of personal data is another critical
ingredient in the mix, all of which trades on making the data and data
processing legible and putting the mechanisms in place that enable users to
exercise agency and negotiability locally amongst their own cohorts as well
as globally.
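One simple policy for such collaborative management might resolve disagreements by honouring the coarsest (most private) preference in the household. The following sketch illustrates this with invented members and granularity levels; it is one possible policy, not a claim about how such conflicts should always be settled.

```python
# Granularity levels for shared smart-meter data; higher rank = coarser.
LEVELS = {"half-hourly": 0, "daily": 1, "monthly": 2}

def agreed_granularity(preferences):
    """Resolve per-member preferences to the coarsest level anyone requested."""
    return max(preferences.values(), key=lambda level: LEVELS[level])

# Invented household: members disagree on how finely to share consumption data.
household = {"alice": "daily", "bob": "half-hourly", "carol": "monthly"}
level = agreed_granularity(household)  # carol's most-private preference prevails
```

Other policies are conceivable – majority vote, per-recipient overrides – and which is appropriate is exactly the kind of negotiability question HDI raises.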
“Remember when, on the Internet, nobody knew who you were?” is a play by
Kaamran Hafeez on the famous Steiner cartoon, also published in The New
Yorker, on February 16, 2015.
So, in such a complex and emerging field, what should you take away? The
cartoon above gives one key takeaway: the simple fact that we do live in a
complex, increasingly data-driven world, and this is the case whether or not
we understand or care. The aim of HDI as a research agenda is to bring this
fact to the fore, to provoke engagement from many parties to address the
challenges we believe this raises. We hope that the framing of these debates
as Human-Data Interaction, and the core principles we claim are at the
heart of HDI, will assist and encourage researchers in many fields –
including Computer Science, Law, Sociology, Statistics and Machine Learning,
among many others – to engage with the challenges and opportunities
posed by our collective data-driven future.
There have also been press articles which garnered some interest in their
comments sections, giving a small sampling of public responses to
privacy and HDI.
41.10 Acknowledgements
This article grows out of work funded by several agencies including RCUK
grants Horizon Digital Economy Research (EP/G065802/1), Privacy By
Design: Building Accountability into the Internet of Things
(EP/M001636/1), CREATe (AH/K000179/1), Databox (EP/N028260/1) and IT
as a Utility Network+ (EP/K003569/1); and the EU FP7 User Centric
Networking grant No. 611001. As well as thanking the HDI community
(http://hdiresearch.org) for their ongoing engagement and input, we
particularly thank Kuan Hon, Yvonne Rogers, Elizabeth Churchill, Ian
Brown, Laura James, Tom Rodden, members of the QMUL Cognitive
Science research group, and attendees at the IT-as-a-Utility Network+
Human-Data Interaction workshop (October 2nd, 2013) for their input.
41.11 References
Adams, Emily K., Mehool Intwala, and Apu Kapadia. 2010. “MeD-Lights:
a usable metaphor for patient controlled access to electronic health
records.” In Proceedings of the 1st ACM International Health
Informatics Symposium, 800–808. IHI ’10. Arlington, Virginia, USA:
ACM. isbn: 978-1-4503-0030-8. doi:10.1145/1882992.1883112.
Aperjis, Christina, and Bernardo A. Huberman. 2012. “A Market for
Unbiased Private Data: Paying Individuals According to their Privacy
Attitudes.” First Monday 17, nos. 5-7 (May). doi:10.5210/fm.v17i5.4013.
Barnes, Susan B. 2006. “A privacy paradox: Social networking in the
United States.” First Monday 11, no. 9 (September 4).
doi:10.5210/fm.v11i9.1394.
Bartlett, Jamie. 2012. The Data Dialogue. London, UK: Demos,
September 14. isbn: 978-1-909037-16-8.
Berlingerio, Michele, Francesco Calabrese, Giusy Lorenzo, Rahul Nair,
Fabio Pinelli, and Marco Luca Sbodio. 2013. “AllAboard: A System for
Exploring Urban Mobility and Optimizing Public Transport Using
Cellphone Data.” In Machine Learning and Knowledge Discovery in
Databases, edited by Hendrik Blockeel, Kristian Kersting, Siegfried
Nijssen, and Filip Železný, 8190:663–666. Lecture Notes in Computer
Science. Berlin, Germany: Springer. doi:10.1007/978-3-642-40994-3
Bowers, John, and Tom Rodden. 1993. “Exploding the Interface:
Experiences of a CSCW Network.” In Proceedings of the INTERACT ’93
and CHI ’93 Conference on Human Factors in Computing Systems,
255–262. CHI ’93. Amsterdam, The Netherlands: ACM. isbn: 0-89791-
575-5. doi:10.1145/169059.169205.
Brown, I., and B. Laurie. 2000. “Security against compelled disclosure.”
In Proc. IEEE ACSAC, 2–10. December. doi:10.1109/ACSAC.2000.898852.
Brown, Ian. 2014. “The Economics of Privacy, Data Protection and
Surveillance.” In Handbook on the Economics of the Internet, edited by
M. Latzer and J.M. Bauer. Cheltenham, UK: Edward Elgar Publishing.
Brown, Ian, Lindsey Brown, and Douwe Korff. 2010. “Using NHS Patient
Data for Research Without Consent.” Law, Innovation and Technology 2,
no. 2 (December): 219–258. issn: 1757-9961.
doi:10.5235/175799610794046186.
Cafaro, Francesco. 2012. “Using embodied allegories to design gesture
suites for human-data interaction.” In Proceedings of the 2012 ACM
Conference on Ubiquitous Computing, 560–563. New York, NY, USA:
ACM. isbn: 978-14503-1224-0. doi:10.1145/2370216.2370309.
Callaghan, Sarah, Steve Donegan, Sam Pepler, Mark Thorley, Nathan
Cunningham, Peter Kirsch, Linda Ault, et al. 2012. “Making Data a First
Class Scientific Output: Data Citation and Publication by NERC’s
Environmental Data Centres.” International Journal of Digital Curation
7, no. 1 (March 10): 107–113. issn: 1746-8256. doi:10.2218/ijdc.v7i1.218.
Campbell, Andrew T., Shane B. Eisenman, Nicholas D. Lane, Emiliano
Miluzzo, Ronald A. Peterson, Hong Lu, Xiao Zheng, Mirco Musolesi,
Kristóf Fodor, and Gahng-Seop Ahn. 2008. “The Rise of People-Centric
Sensing,” IEEE Internet Computing 12, no. 4 (July): 12–21. issn: 1089-
7801. doi:10.1109/mic.2008.90.
Card, Stuart K., Thomas P. Moran, and Allen Newell. 1983. The
psychology of human-computer interaction. Hillsdale, NJ, USA:
Lawrence Erlbaum Associates, February. isbn: 0898598591.
Choe, Eun K., Nicole B. Lee, Bongshin Lee, Wanda Pratt, and Julie A.
Kientz. 2014. “Understanding quantified-selfers’ practices in collecting
and exploring personal data.” In Proceedings of the SIGCHI Conference
on Human Factors in Computing Systems, 1143–1152. Toronto, ON,
Canada: ACM Press. isbn: 9781450324731.
doi:10.1145/2556288.2557372.
Clark, David D. 1995. “The Design Philosophy of the DARPA Internet
Protocols.” SIGCOMM Comput. Commun. Rev. (New York, NY, USA) 25,
no. 1 (January): 102–111. issn: 0146-4833. doi:10.1145/205447.205458.
Coles-Kemp, Lizzie, and Elahe K. Zabihi. 2010. “On-line privacy and
consent: a dialogue, not a monologue.” In Proceedings of the 2010
workshop on New security paradigms, 95–106. NSPW ’10. Concord, MA,
USA: ACM, September. isbn: 978-1-4503-0415-3.
doi:10.1145/1900546.1900560.
Court of Justice of the European Union. 2014. An internet search engine
operator is responsible for the processing that it carries out of personal
data which appear on web pages published by third parties. Judgment in
Case C-131/12, May 13.
Crabtree, Andy, and Richard Mortier. 2015. “Human Data Interaction:
Historical Lessons from Social Studies and CSCW.” In Proceedings of
European Conference on Computer Supported Cooperative Work[70]
(ECSCW). Oslo, Norway, September.
Dourish, Paul. 2004. “What We Talk About when We Talk About
Context.” Personal Ubiquitous Comput. (London, UK, UK) 8, no. 1
(February): 19–30. issn: 1617-4909. doi:10.1007/s00779-003-0253-8.
Dwork, Cynthia. 2006. “Differential Privacy.” In Automata, Languages
and Programming, edited by Michele Bugliesi, Bart Preneel, Vladimiro
Sassone, and Ingo Wegener, 1–12. Berlin, Germany: Springer Berlin /
Heidelberg. doi:10.1007/11787006_1.
Elmqvist, Niklas. 2011. “Embodied Human-Data Interaction.” In
Proceedings of the CHI Workshop on Embodied Interaction: Theory and
Practice in HCI, 104–107. May.
Estrin, Deborah. 2013. small data, N=me, Digital Traces. Talk presented
at TEDMED 2013, Washington, DC, USA, April.
European Parliament. 2014. Legislative resolution of 12 March 2014 on
the proposal for a regulation of the European Parliament and of the
Council on the protection of individuals with regard to the processing of
personal data and on the free movement[71] of such data (General Data
Protection Regulation).
http://www.europarl.europa.eu/sides/getDoc.do?type=TA&reference=P7-TA-2014-0212&language=EN, March.
Falahrastegar, Marjan, Hamed Haddadi, Steve Uhlig, and Richard
Mortier. 2014. “The Rise of Panopticons: Examining Region-Specific
Third-Party Web Tracking.” In Proc. Traffic Monitoring and Analysis,
edited by Alberto Dainotti, Anirban Mahanti, and Steve Uhlig,
8406:104–114. Lecture Notes in Computer Science. Also as arXiv
preprint arXiv:1409.1066. Springer Berlin Heidelberg, April. isbn: 978-3-
642-54998-4. doi:10.1007/978-3-642-54999-1_9.
Falahrastegar, Marjan, et al. 2016. “Tracking Personal Identifiers Across
the Web.” In Proceedings of Passive and Active Measurement (PAM).
Fan, Chloe. 2013. “The Future of Data Visualization[72] in Personal
Informatics Tools.” In Personal Informatics in the Wild: Hacking Habits
for Health & Happiness CHI 2013 Workshops. ACM.
Grudin, Jonathan. 1990a. “Interface.” In Proceedings of the 1990 ACM
Conference on Computer-supported Cooperative Work, 269–278. CSCW
’90. Los Angeles, California, USA: ACM. isbn: 0-89791-402-3.
doi:10.1145/99332.99360.
Grudin, Jonathan. 1990b. “The Computer Reaches out: The Historical
Continuity of Interface Design.” In Proceedings of the SIGCHI
Conference on Human Factors in Computing Systems, 261–268. CHI ’90.
Seattle, Washington, USA: ACM. isbn: 0-201-50932-6.
doi:10.1145/97243.97284.
Guha, Saikat, Alexey Reznichenko, Kevin Tang, Hamed Haddadi, and
Paul Francis. 2009. “Serving Ads from localhost for Performance,
Privacy, and Profit.” In Proceedings of the Eighth ACM Workshop on Hot
Topics in Networks (HotNets-VIII). New York City, NY, USA.
Haddadi, Hamed, Heidi Howard, Amir Chaudhry, Jon Crowcroft, Anil
Madhavapeddy, Derek McAuley, and Richard Mortier. 2015. “Personal
Data: Thinking Inside the Box.” In Proceedings of the 5th Decennial
Aarhus Conference (Aarhus 2015). August.
Haddadi, Hamed, Pan Hui, and Ian Brown. 2010. “MobiAd: private and
scalable mobile advertising.” In Proceedings of the fifth ACM
International Workshop on Mobility in the Evolving Internet
Architecture, 33–38. MobiArch ’10. Chicago, Illinois, USA: ACM. isbn:
978-1-4503-0143-5. doi:10.1145/1859983.1859993.
Haddadi, Hamed, Richard Mortier, Derek McAuley, and Jon Crowcroft.
2013. Human-data interaction. Technical report UCAM-CL-TR-837.
Computer Laboratory, University of Cambridge, June.
Huberman, Bernardo A. 2012. “Sociology of science: Big data deserve a
bigger audience[73].” Nature 482, no. 7385 (February 16): 308. issn:
1476-4687. doi:10.1038/482308d.
Ioannidis, John P. A. 2013. “Informed Consent, Big Data, and the
Oxymoron of Research That Is Not Research.” American Journal of
Bioethics 13, no. 4 (March 20): 40–42.
doi:10.1080/15265161.2013.768864.
Jacobs, Rachel, Steve Benford, Mark Selby, Michael Golembewski,
Dominic Price, and Gabriella Giannachi. 2013. “A Conversation[74]
Between Trees: What Data Feels Like in the Forest.” In Proceedings of
the SIGCHI Conference on Human Factors in Computing Systems,
129–138. CHI ’13. New York, NY, USA: ACM.
doi:10.1145/2470654.2470673[75].
Jacobs, Rachel, Steve Benford, Ewa Luger, and Candice Howarth. 2016.
“The Prediction Machine: Performing Scientific and Artistic Process.” In
Proceedings of the 2016 ACM Conference on Designing Interactive
Systems, 497–508. DIS ’16. New York, NY, USA: ACM.
doi:10.1145/2901790.2901825[76].
Kapadia, Apu, Tristan Henderson, Jeffrey J. Fielding, and David Kotz.
2007. “Virtual Walls: Protecting Digital Privacy in Pervasive
Environments.” Chap. 10 in Proceedings of the 5th International
Conference on Pervasive Computing, edited by Anthony LaMarca, Marc
Langheinrich, and Khai N. Truong, 4480:162–179. Lecture Notes in
Computer Science. Toronto, ON, Canada: Springer Berlin / Heidelberg,
May. isbn: 978-3-540-72036-2. doi:10.1007/978-3-540-72037-9_10.
Kee, Kerk F., Larry D. Browning, Dawna I. Ballard, and Emily B. Cicchini.
2012. “Sociomaterial processes, long term planning, and infrastructure
funding: Towards effective collaboration and collaboration tools for
visual and data analytics.” In Presented at the NSF sponsored Science of
Interaction for Data and Visual Analytics Workshop. Austin, TX, March.
Kellingley, Nick. 2015. Human Data Interaction (HDI): The New
Information Frontier. https://www.interaction-design.org/literature/article/human-data-interaction-hdi-the-new-information-frontier, November.
Kum, Hye-Chung, Ashok Krishnamurthy, Ashwin Machanavajjhala, and
Stanley C. Ahalt. 2014. “Social Genome: Putting Big Data to Work for
Population Informatics.” Computer 47, no. 1 (January): 56–63. issn:
0018-9162. doi:10.1109/mc.2013.405.
Kumar, Santosh, Wendy Nilsen, Misha Pavel, and Mani Srivastava. 2013.
“Mobile Health: Revolutionizing Healthcare Through Transdisciplinary
Research.” Computer 46, no. 1 (January): 28–35. issn: 0018-9162.
doi:10.1109/mc.2012.392.
Lanier, Jaron. 2013. Who Owns The Future? New York, NY, USA: Simon
& Schuster.
Leon, Pedro G., Justin Cranshaw, Lorrie F. Cranor, Jim Graves, Manoj
Hastak, Blase Ur, and Guzi Xu. 2012. “What do online behavioral
advertising privacy disclosures communicate to users?” In Proceedings
of the 2012 ACM workshop on Privacy in the electronic society, 19–30.
WPES ’12. Raleigh, North Carolina, USA: ACM. isbn: 978-1-4503-1663-7.
doi:10.1145/2381966.2381970.
Leontiadis, Ilias, Christos Efstratiou, Marco Picone, and Cecilia Mascolo.
2012. “Don’t Kill My Ads!: Balancing Privacy in an Ad-supported Mobile
Application Market.” In Proceedings of the Twelfth Workshop on Mobile
Computing[77] Systems & Applications, 2:1–2:6. HotMobile ’12. San
Diego, California: ACM. isbn: 978-1-4503-1207-3.
doi:10.1145/2162081.2162084.
Luger, Ewa, Stuart Moran, and Tom Rodden. 2013. “Consent for All:
Revealing the Hidden Complexity of Terms and Conditions.” In
Proceedings of the SIGCHI Conference on Human Factors in Computing
Systems, 2687–2696. CHI ’13. Paris, France: ACM. isbn: 978-1-4503-
1899-0. doi:10.1145/2470654.2481371.
Luger, Ewa, and Tom Rodden. 2013. “An Informed View on Consent for
UbiComp.” In Proceedings of the 2013 ACM International Joint
Conference on Pervasive and Ubiquitous Computing, 529–538. UbiComp
’13. Zurich, Switzerland: ACM. isbn: 978-1-4503-1770-2.
doi:10.1145/2493432.2493446.
Madden, Mary, Susannah Fox, Aaron Smith, and Jessica Vitak. 2007.
Digital Footprints: Online Identity Management and Search in the Age of
Transparency. Washington, DC: Pew Internet & American Life Project,
December. Retrieved Feb. 23, 2014 from
http://www.pewinternet.org/files/old-media//Files/R...[78] PIP
Digital Footprints.pdf.pdf.
Mashhadi, Afra, Fahim Kawsar, and Utku G. Acer. 2014. “Human Data
Interaction in IoT: The ownership aspect.” In IEEE World Forum on
Internet of Things (WF-IoT), 159–162. March. doi:10.1109/WF-
IoT.2014.6803139.
Mayer-Schönberger, Viktor. 2009. Delete: The Virtue of Forgetting in the
Digital Age. Princeton University Press. isbn: 9781400831289.
McAuley, Derek, Richard Mortier, and James Goulding. 2011. “The
Dataware Manifesto.” In Proceedings of the 3rd IEEE International
Conference on Communication Systems and Networks (COMSNETS).
Invited paper. Bangalore, India, January.
MIT Technology Review. 2015. The Emerging Science of Human-Data
Interaction. http://www.technologyreview.com/view/533901/the-emerging-science-of-human-data-interaction/, January.
Mortier, Richard, Hamed Haddadi, Tristan Henderson, Derek McAuley,
and Jon Crowcroft. 2014. “Human-Data Interaction: The Human Face of
the Data-Driven Society.” http://dx.doi.org/10.2139/ssrn.2508051, SSRN
(October).
Murphy, Kate. 2014. “We Want Privacy, but Can’t Stop Sharing.”
http://www.nytimes.com/2014/10/05/sunday-review/we-want-privacy-but-cant-stop-sharing.html, New York Times (October).
Naehrig, Michael, Kristin Lauter, and Vinod Vaikuntanathan. 2011. “Can
Homomorphic Encryption Be Practical?” In Proc. ACM Cloud Computing
Security Workshop, 113–124. Chicago, Illinois, USA. isbn: 978-1-4503-
1004-8. doi:10.1145/2046660.2046682.
Naughton, John. 2015. “Fightback against internet giants’ stranglehold
on personal data starts here.”
http://www.theguardian.com/technology/2015/feb/01/control-personal-data-databox-end-user-agreement[79], The Guardian (February).
Nissenbaum, Helen F. 2004. “Privacy as Contextual Integrity.”
Washington Law Review 79, no. 1 (February): 119–157.
Ohm, Paul. 2010. “Broken Promises of Privacy: Responding to the
Surprising Failure[80] of Anonymization.”
http://uclalawreview.org/pdf/57-6-3.pdf, UCLA Law Review 57:1701–
1778.
Organisation for Economic Co-operation and Development. 2013.
Exploring the Economics of Personal Data - A Survey of Methodologies
for Measuring Monetary Value. Technical report, OECD Digital Economy
Papers 220. OECD, April 2. doi:10.1787/5k486qtxldmq-en.
O’Rourke, JoAnne M., Stephen Roehrig, Steven G. Heeringa, Beth G.
Reed, William C. Birdsall, Margaret Overcashier, and Kelly Zidar. 2006.
“Solving Problems of Disclosure Risk While Retaining Key Analytic Uses
of Publicly Released Microdata.” Journal of Empirical Research on
Human Research Ethics 1, no. 3 (September): 63–84. issn: 1556-2646.
doi:10.1525/jer.2006.1.3.63.
Oxford English Dictionary. 2014.
http://www.oed.com/view/Entry/296948, February.
Pariser, Eli. 2011. The Filter Bubble: What the Internet Is Hiding from
You. New York, NY, USA: Penguin Press, May. isbn: 1594203008.
Patil, Sameer, Roman Schlegel, Apu Kapadia, and Adam J. Lee. 2014.
“Reflection or Action?: How Feedback and Control Affect Location
Sharing Decisions.” In Proceedings of the ACM SIGCHI Conference on
Human Factors in Computing Systems, 101–110. Toronto, ON, Canada,
April. doi:10.1145/2556288.2557121.
Pentland, Alex. 2012. “Reinventing society in the wake of Big Data.”
http://edge.org/conversation/reinventing-society-in-the-wake-of-big-data, Edge (August).
Ricci, Francesco, Lior Rokach, Bracha Shapira, and Paul B. Kantor. 2010.
Recommender Systems Handbook. 1st. New York, NY, USA: Springer-
Verlag New York, Inc. isbn: 0387858199, 9780387858197.
Schmidt, Kjeld. 1994. Social Mechanisms of Interaction. Technical report
COMIC Deliverable 3.2. ISBN 0-901800-55-4. Esprit Basic Research
Action 6225.
Shilton, Katie. 2012. “Participatory personal data: An emerging research
challenge for the information sciences.” Journal of the American Society
for Information Science and Technology 63, no. 10 (October): 1905–
1915. issn: 1532-2882. doi:10.1002/asi.22655.
Shilton, Katie, Jeff Burke, Deborah Estrin, Ramesh Govindan, Mark
Hansen, Jerry Kang, and Min Mun. 2009. “Designing the Personal Data
Stream: Enabling Participatory Privacy in Mobile Personal Sensing.” In
Proceedings of the 37th Research Conference on Communication,
Information and Internet Policy (TPRC). Arlington, VA, USA, September.
Solove, Daniel J. 2013. “Privacy Self-Management and the Consent
Dilemma.” Harvard Law Review 126, no. 7 (May): 1880–1903.
Star, Susan Leigh. 2010. “This is Not a Boundary Object: Reflections on
the Origin of a Concept.” Science, Technology & Human Values 35 (5):
601–617. doi:10.1177/0162243910377624. eprint:
http://sth.sagepub.com/content/35/5/601.full.pdf+html[81].
Star, Susan Leigh, and James R. Griesemer. 1989. “Institutional Ecology,
‘Translations’ and Boundary Objects: Amateurs and Professionals in
Berkeley’s Museum of Vertebrate Zoology, 1907-39.” Social Studies of
Science 19 (3): 387–420. doi:10.1177/030631289019003001.
Strategic Headquarters for the Promotion of an Advanced Information
and Telecommunications Network Society. 2014. Policy Outline of the
Institutional Revision for Utilization of Personal Data.
http://japan.kantei.go.jp/policy/it/20140715_2.pdf[82].
Taddicken, Monika, and Cornelia Jers. 2011. “The Uses of Privacy
Online: Trading a Loss of Privacy for Social Web Gratifications?” In
Privacy Online: Perspectives on Privacy and Self-Disclosure in the Social
Web, 1st ed., edited by Sabine Trepte and Leonard Reinecke, 143–156.
Springer-Verlag Berlin Heidelberg. isbn: 978-3-642-21520-9.
doi:10.1007/978-3-642-21521-6_11.
The Open Data Institute. http://theodi.org/.
Tolmie, P., A. Crabtree, T. Rodden, C. Greenhalgh, and S. Benford. 2007.
“Making the home network at home: digital housekeeping.” In
Proceedings of ECSCW, 331–350. Limerick, Ireland: Springer.
Trudeau, Stephanie, Sara Sinclair, and Sean W. Smith. 2009. “The effects
of introspection on creating privacy policy.” In WPES ’09: Proceedings of
the 8th ACM workshop on Privacy in the electronic society, 1–10.
Chicago, IL, USA: ACM, November. isbn: 978-1-60558-783-7.
doi:10.1145/1655188.1655190.
US Consumer Privacy Bill of Rights. 2012. Consumer Data Privacy in a
Networked World: A Framework for Protecting Privacy and Promoting
Innovation in the Global Digital Economy.
https://www.whitehouse.gov/sites/default/files/privacy-final.pdf[83],
February.
Vallina-Rodriguez, Narseo, Jay Shah, Alessandro Finamore, Yan
Grunenberger, Konstantina Papagiannaki, Hamed Haddadi, and Jon
Crowcroft. 2012. “Breaking for commercials: characterizing mobile
advertising.” In Proceedings of the 2012 ACM Internet Measurement
Conference, 343–356. Boston, MA, USA: ACM, November. isbn: 978-1-
4503-1705-4. doi:10.1145/2398776.2398812.
Westby, Jody R. 2011. “Legal issues associated with data collection &
sharing.” In Proceedings of the First Workshop on Building Analysis
Datasets and Gathering Experience Returns for Security, 97–102.
Salzburg, Austria, April. isbn: 978-1-4503-0768-0.
doi:10.1145/1978672.1978684.
Whitley, Edgar A. 2009. “Informational privacy, consent and the
“control” of personal data.” Information Security Technical Report 14,
no. 3 (August): 154–159. issn: 13634127. doi:10.1016/j.istr.2009.10.001.
Winstein, Keith. 2015. “Introducing the right to eavesdrop on your
things.” http://www.politico.com/agenda/story/2015/06/internet-of-things-privacy-concerns-000107, The Agenda Magazine (July).
World Economic Forum. 2011. Personal Data: The Emergence of a New
Asset Class. http://www.weforum.org/reports/personal-data-emergence-new-asset-class. In collaboration with Bain & Company, January.
Zaslavsky, Arkady, Charith Perera, and Dimitrios Georgakopoulos. 2012.
“Sensing as a Service and Big Data.” In Proceedings of the International
Conference on Advances in Cloud Computing (ACC). Bangalore, India,
July.
Links
1. https://www.interaction-design.org/literature/topics/create
2. https://www.interaction-design.org/literature/topics/facebook
3. https://www.interaction-design.org/literature/topics/twitter
4. https://www.interaction-design.org/literature/topics/hci
5. https://www.interaction-design.org/literature/topics/shape
6. https://www.interaction-design.org/literature/topics/value
7. https://www.interaction-design.org/literature/topics/scale
8. https://www.interaction-design.org/literature/topics/concerns
9. https://www.interaction-design.org/literature/topics/google
10. https://www.interaction-design.org/literature/topics/dominance
11. https://www.interaction-design.org/literature/topics/innovation
12. https://www.interaction-design.org/literature/topics/context
13. https://www.interaction-design.org/literature/topics/social-media
14. https://www.interaction-design.org/literature/topics/data-collection
15. https://www.interaction-design.org/literature/topics/portfolios
16. https://www.interaction-design.org/literature/topics/interests
17. https://www.interaction-design.org/literature/topics/developers
18. https://www.interaction-design.org/literature/topics/issues
19. https://www.interaction-design.org/literature/topics/information-overload
20. https://www.interaction-design.org/literature/topics/contrast
21. https://www.interaction-design.org/literature/topics/user-control
22. https://www.interaction-design.org/literature/topics/collaboration
23. https://www.interaction-design.org/literature/topics/tailoring
24. https://www.interaction-design.org/literature/topics/loyalty
25. https://www.interaction-design.org/literature/topics/heart
26. https://www.interaction-design.org/literature/topics/principles
27. https://www.interaction-design.org/literature/topics/recognition
28. https://www.interaction-design.org/literature/topics/storage
29. https://www.interaction-design.org/literature/topics/modify
30. https://www.interaction-design.org/literature/topics/engagement
31. https://www.interaction-design.org/literature/topics/patterns
32. https://www.interaction-design.org/literature/topics/data-visualisation
33. https://www.interaction-design.org/literature/topics/environment
34. https://www.interaction-design.org/literature/topics/design
35. https://www.interaction-design.org/literature/topics/challenge
36. https://www.interaction-design.org/literature/topics/feedback
37. https://www.interaction-design.org/literature/topics/balance
38. https://www.interaction-design.org/literature/topics/experiment
39. http://theodi.org/
40. https://www.interaction-design.org/literature/topics/analogies
41. https://www.interaction-design.org/literature/topics/point-of-view
42. https://www.interaction-design.org/literature/topics/human-data-interaction
43. https://www.interaction-design.org/literature/topics/practice
44. https://www.interaction-design.org/literature/topics/line
45. https://www.interaction-design.org/literature/topics/reverse
46. https://www.interaction-design.org/literature/topics/hand
47. https://www.interaction-design.org/literature/topics/respect
48. https://www.interaction-design.org/literature/topics/expectations
49. https://www.interaction-design.org/literature/topics/design-challenge
50. https://www.interaction-design.org/literature/topics/decide
51. https://www.interaction-design.org/literature/topics/play
52. https://www.interaction-design.org/literature/topics/social-networking
53. https://www.interaction-design.org/literature/topics/discoverability
54. https://www.interaction-design.org/literature/topics/deliver
55. https://www.interaction-design.org/literature/topics/combine
56. https://www.interaction-design.org/literature/topics/learning
57. https://www.interaction-design.org/literature/topics/analogy
58. https://www.interaction-design.org/literature/topics/physical-environment
59. https://www.interaction-design.org/literature/topics/client
60. https://turing.ac.uk/
61. http://www.itutility.ac.uk/
62. http://www.nytimes.com/2014/10/05/sunday-review/we-want-privacy-but-cant-stop-sharing.html
63. https://www.interaction-design.org/literature/topics/review
64. http://www.technologyreview.com/view/533901/the-emerging-science-of-human-data-interaction/
65. https://www.interaction-design.org/literature/article/
66. https://www.interaction-design.org/literature/topics/yvonne-rogers
67. https://www.interaction-design.org/literature/topics/cognitive-science
68. https://www.interaction-design.org/literature/topics/networking
69. https://www.interaction-design.org/literature/topics/human-factors
70. https://www.interaction-design.org/literature/topics/computer-supported-cooperative-work
71. https://www.interaction-design.org/literature/topics/movement
72. https://www.interaction-design.org/literature/topics/data-visualization
73. https://www.interaction-design.org/literature/topics/audience
74. https://www.interaction-design.org/literature/topics/conversation
75. http://dx.doi.org/10.1145/2470654.2470673
76. http://dx.doi.org/10.1145/2901790.2901825
77. https://www.interaction-design.org/literature/topics/mobile-computing
78. http://www.pewinternet.org/files/old-media//Files/Reports/2007/
79. http://www.theguardian.com/technology/2015/
80. https://www.interaction-design.org/literature/topics/failure
81. http://sth.sagepub.com/content/
82. http://japan.kantei.go.jp/policy/it/20140715_2.pdf
83. https://www.whitehouse.gov/sites/