Data Privacy

What is Data Ethics

• The responsible use of data.
• Data processes should be designed as sustainable solutions that benefit
humans first and foremost.
• Adhere to the principles and values on which human rights and
personal data protection laws are based.
• Practice genuine transparency in data management.
• Actively develop privacy-by-design and privacy-enhancing products and
infrastructures.
• Treat someone else's personal information as you would wish your own,
or your children's, to be treated.
Which of these life-impacting events, both positive and
negative, might be the direct result of data practices?

• A. Rosalina, a promising and hard-working law intern with a mountain of
student debt and a young child, is denied a promotion at work that would
have given her a livable salary and a stable career path, even though her work
record made her the objectively best candidate for the promotion.
• B. John, a middle-aged father of four, is diagnosed with an inoperable,
aggressive, and advanced brain tumor. A few decades ago, his tumor
would probably have been judged untreatable.
• C. The Patels, a family of five living in an urban floodplain in India, receive
several days advance warning of an imminent, epic storm that is almost
certain to bring life-threatening floodwaters to their neighborhood. They and
their neighbors now have sufficient time to gather their belongings and safely
evacuate to higher ground.
• The absence of widespread, well-designed standards for data practice in
industry, university, non-profit, and government sectors has created a
‘perfect storm’ of ethical risks.
• Managing those risks wisely requires understanding the vast potential
for data to generate ethical benefits as well.
• What makes a harm or benefit ‘ethically significant’?
• What significant ethical benefits and harms are linked to data?
• Data has the potential to significantly impact all fundamental interests of
human beings.
• Data practitioners must confront a far more complex ethical landscape.
ETHICALLY SIGNIFICANT BENEFITS OF
DATA PRACTICES
• Human understanding
• Twenty-five quintillion bytes of data are generated every day.
• 25,000,000,000,000,000,000.
• The more we understand about the world and how it works, the more intelligently
we can act in it.
• Social, institutional, and economic efficiency
• Reduces wasted effort and resources and improves the alignment between a
social system or institution’s policies/processes and our goals.
• Predictive accuracy and personalization
• Precisely tailor actions to be effective in achieving good outcomes for specific
individuals, groups, and circumstances
ETHICALLY SIGNIFICANT HARMS
OF DATA PRACTICES
• Harms to privacy & security
• Reputational, economic and emotional harm.
• Data does not stay confined to the digital context in which it was originally shared.
• Harms to fairness and justice
• Arbitrariness; avoidable errors and inaccuracies; and unjust and often hidden biases
• Implicit biases due to inadequate design and testing of data analytics; or a lack of
careful training and auditing. Ex. Criminal risk predictive algorithms.
• Garbage in, garbage out
• Harms to transparency and autonomy
• Complex data techniques
• Trade secrets and proprietary technology

Case of Loan Application


Ethical challenges for data
practitioners and users
• ETHICAL CHALLENGES IN APPROPRIATE DATA COLLECTION AND USE
How can we properly acknowledge and respect the purpose for, and context within which, certain
data was shared with us or generated for us?
How can we avoid unwarranted or indiscriminate data collection?
Have we adequately considered the ethical implications of selling or sharing subjects’ data with
third-parties?
Have we given data subjects appropriate forms of choice in data sharing?
Are the terms of our data policy laid out in a clear, direct, and understandable way, and made
accessible to all data subjects?
Are data subjects given clear paths to obtaining more information or context for a data practice?
Are data subjects being appropriately compensated for the benefits/value of their data?
Have we considered what control or rights our data subjects should retain over their data?
• DATA STORAGE, SECURITY AND RESPONSIBLE DATA STEWARDSHIP
How can we responsibly and safely store personally identifying information?
Have we reflected on the ethical harms that may be done by a data breach, both in the short-
term and long-term, and to whom?
What are our concrete action plans for worst-case scenarios?
Have we made appropriate investments in our data security/storage infrastructure?
What privacy-preserving techniques such as data anonymization, and differential privacy do we
rely upon, and what are their various advantages and limitations?
What are the ethical risks of long-term data storage? How long are we justified in keeping
sensitive data, and when/how often should it be eliminated?
Do we have an end-to-end plan for the lifecycle of the data we collect or use?
What measures should we have in place to allow data to be deleted, corrected, or updated by
affected/interested parties?
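One of the privacy-preserving techniques named above, differential privacy, can be illustrated with the classic Laplace mechanism for a counting query. This is a minimal sketch, not a production implementation; the `dp_count` helper and the example records are hypothetical.

```python
import math
import random

def dp_count(values, predicate, epsilon=1.0):
    """Differentially private count via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one record
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon gives epsilon-differential privacy. Smaller epsilon
    means more noise and stronger privacy."""
    true_count = sum(1 for v in values if predicate(v))
    # Sample Laplace(0, 1/epsilon) noise by inverse-CDF sampling.
    u = random.random() - 0.5
    scale = 1.0 / epsilon
    sign = 1.0 if u >= 0 else -1.0
    noise = -scale * sign * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Example: release an approximate count of records over a threshold
# instead of the exact figure.
ages = [25, 31, 47, 52, 38]
noisy = dp_count(ages, lambda age: age >= 40, epsilon=0.5)
```

Note the trade-off this makes concrete: anonymization alone can often be reversed by linking datasets, whereas here the noise level, not the data format, carries the privacy guarantee.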
• DATA HYGIENE AND DATA RELEVANCE
Inaccuracy and inconsistency in data
Practices and procedures for validation and auditing of data.
Connecting data from different sets
Tools and practices for data cleaning
Diversity of the data sources and/or training datasets
How long is this data likely to remain accurate, useful or relevant?
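Several of the hygiene points above (inaccuracy, validation, auditing) can be made concrete with a simple rule-based audit pass. A minimal sketch; the field names and valid ranges are hypothetical examples, not a standard schema.

```python
def audit_records(records, required_fields, valid_ranges):
    """Return a list of (row_index, field, problem) tuples flagging
    missing required fields and values outside an expected range --
    a first-pass validation before data is used downstream."""
    issues = []
    for i, row in enumerate(records):
        for f in required_fields:
            if row.get(f) in (None, ""):
                issues.append((i, f, "missing"))
        for f, (lo, hi) in valid_ranges.items():
            v = row.get(f)
            if isinstance(v, (int, float)) and not lo <= v <= hi:
                issues.append((i, f, "out of range"))
    return issues

# Example: audit two records against simple validity rules.
rows = [
    {"name": "A. Kumar", "age": 34},
    {"name": "", "age": 212},  # blank name, implausible age
]
problems = audit_records(rows, ["name"], {"age": (0, 120)})
```

Running such checks routinely, and logging what they find, is one way to turn "practices and procedures for validation and auditing" from a slogan into a repeatable step.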
• IDENTIFYING AND ADDRESSING ETHICALLY HARMFUL DATA BIAS
Harmful human biases are often reflected in data.
Have we distinguished carefully between the forms of bias we should want to
be reflected in our data or application, and those that are harmful or
otherwise unwarranted?
Have we sufficiently understood how this bias could do harm, and to
whom?
How might harmful or unwarranted bias in our data get magnified,
transmitted, obscured, or perpetuated by our use of it?
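One common, if crude, way to check whether a data practice magnifies harmful bias is to compare per-group selection rates, as in the "four-fifths rule" used in US employment auditing. A sketch assuming binary decisions and known group labels; the 0.8 threshold is a convention for flagging cases to audit, not a legal or statistical proof of bias.

```python
from collections import defaultdict

def disparate_impact(decisions, groups, privileged):
    """Ratio of each group's positive-decision rate to the privileged
    group's rate. Ratios well below 1.0 (conventionally below 0.8)
    suggest the decisions should be audited for unjust bias."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for d, g in zip(decisions, groups):
        totals[g] += 1
        positives[g] += int(d)
    rates = {g: positives[g] / totals[g] for g in totals}
    base = rates[privileged]
    return {g: r / base for g, r in rates.items()}

# Example: group B is approved far less often than group A.
ratios = disparate_impact(
    decisions=[1, 1, 1, 0, 1, 0, 0, 0],
    groups=["A", "A", "A", "A", "B", "B", "B", "B"],
    privileged="A",
)
```

A low ratio does not by itself prove the model is unjust, but it is exactly the kind of signal the criminal risk prediction example shows can otherwise stay hidden.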
• VALIDATION AND TESTING OF DATA MODELS & ANALYTICS
Have we adequately tested our analytics/data models to validate their
performance?
Do we understand the ethical harms that can follow from inadequate
validation and testing?
What ethical challenges might arise from a lack of transparency?
Are our analytics reliable across new, unexpected contexts?
Have our analytics been audited for disparate and unjust
outcomes?
• HUMAN ACCOUNTABILITY IN DATA PRACTICES AND SYSTEMS
Who is responsible and accountable for data ethics in the organization?
Is there an organizational structure for data ethics?
Are systems in place to avoid ethical harm?
Is there a process to allow an affected party to appeal the result or challenge
the use of a data practice?
Is the practice open for public inspection and comment?
• EFFECTIVE CUSTOMER/USER TRAINING IN USE OF DATA AND
ANALYTICS
Are employees trained to use customer data ethically?
Do customers/users have an accurate view of the limits and proper use of the
data, data practice or system?
• UNDERSTANDING PERSONAL, SOCIAL, AND BUSINESS IMPACTS OF
DATA PRACTICE
Have we considered how our data practices will impact data subjects or
other parties later on?
Is there sufficient representation of all stakeholders to get an accurate picture?
Does our data practice avoid violating anyone’s legal or moral rights, limiting
their fundamental human capabilities, or otherwise damaging their
fundamental life interests?
Best practices for data ethics
• Keep Data Ethics in the Spotlight—and Out of the Compliance Box
• Consider the Human Lives and Interests Behind the Data
• Focus on Downstream Risks and Uses of Data
• Don’t Miss the Forest for the Trees: Envision the Data Ecosystem
• Mind the Gap Between Expectations and Reality
• Treat Data as a Conditional Good
• Avoid Dangerous Hype and Myths around ‘Big Data’
• Establish Chains of Ethical Responsibility and Accountability
• Practice Data Disaster Planning and Crisis Response
• Promote Values of Transparency, Autonomy, and Trustworthiness
• Consider Disparate Interests, Resources, and Impacts
• Invite Diverse Stakeholder Input
• Design for Privacy and Security
• Make Ethical Reflection & Practice Standard, Pervasive, Iterative, and Rewarding
Data Ethics and Digital Trust
• Trust is the foundation of our relationships in a digital society, and the
treatment of privacy is the balance established between companies and
people.
• Firms must be privacy optimists.
• Trust is a key prerequisite for online interactions.
• Digital trust is lowest for online shopping.
SNOWDEN EFFECT
• Named after Edward Snowden.
• The large-scale political, cultural and economic fallout from his disclosures.
• The PRISM program.
• For firms, trust now means creating transparency for their customers concerning
their interactions with the US government.
THE SHARING ECONOMY
• Trust is profit.
• Platforms mediate trust between private individuals by giving them the tools to
verify or to create expectations about each other and the products
and services they use.
• Reputation capital is key for collaborative consumption.
How to Achieve Trust
• Trust is about expectations
• Certifications from third parties can help in achieving trust
• Recommendations
• Privacy Branding

Failing to deliver on a promise of privacy is more fatal
than not making any promises at all.
GDPR
• European General Data Protection Regulation came into effect in 2018.
• Intended to harmonize privacy and data protection laws across Europe
while helping EU citizens to better understand how their personal
information was being used, and encouraging them to file a complaint if
their rights were violated.
• The digital economy — fueled by (personal) information — should
operate with the informed consent of users and clear rules for
companies who seek to do business in the European Union.
GDPR
• This Regulation lays down rules relating to the protection of natural
persons with regard to the processing of personal data and rules
relating to the free movement of personal data.
• This Regulation protects fundamental rights and freedoms of natural
persons and in particular their right to the protection of personal
data.
• The free movement of personal data within the Union shall be neither
restricted nor prohibited for reasons connected with the protection of
natural persons with regard to the processing of personal data.
When you are allowed to process
data?
• The data subject gave you specific, unambiguous consent to process the data. (e.g. They’ve opted
in to your marketing email list.)
• Processing is necessary to execute or to prepare to enter into a contract to which the data subject
is a party. (e.g. You need to do a background check before leasing property to a prospective tenant.)
• You need to process it to comply with a legal obligation of yours. (e.g. You receive an order from
the court in your jurisdiction.)
• You need to process the data to save somebody’s life. (e.g. Well, you’ll probably know when this
one applies.)
• Processing is necessary to perform a task in the public interest or to carry out some official
function. (e.g. You’re a private garbage collection company.)
• You have a legitimate interest to process someone’s personal data. This is the most flexible lawful
basis, though the “fundamental rights and freedoms of the data subject” always override your
interests, especially if it’s a child’s data.
Consent for Data use
• Consent must be “freely given, specific, informed and unambiguous.”
• Requests for consent must be “clearly distinguishable from the other
matters” and presented in “clear and plain language.”
• Data subjects can withdraw previously given consent whenever they
want, and you have to honor their decision. You can’t simply change
the legal basis of the processing to one of the other justifications.
• Children under 16 (member states may lower this threshold to 13) can
only give consent with permission from a parent.
• You need to keep documentary evidence of consent.
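The requirements to keep documentary evidence of consent and to honor withdrawal can be sketched as an append-only consent log. A minimal illustration; `ConsentLog` and its event format are hypothetical, not a GDPR-mandated schema.

```python
from datetime import datetime, timezone

class ConsentLog:
    """Append-only record of consent events per (subject, purpose).
    The latest event wins, so a withdrawal overrides an earlier grant
    while the full history is preserved as evidence."""

    def __init__(self):
        self.events = []  # (subject_id, purpose, action, utc_timestamp)

    def record(self, subject_id, purpose, action):
        if action not in ("granted", "withdrawn"):
            raise ValueError("action must be 'granted' or 'withdrawn'")
        self.events.append(
            (subject_id, purpose, action, datetime.now(timezone.utc))
        )

    def has_consent(self, subject_id, purpose):
        """True only if the most recent matching event is a grant."""
        state = None
        for sid, p, action, _ in self.events:
            if sid == subject_id and p == purpose:
                state = action
        return state == "granted"
```

Withdrawal here appends a new event rather than deleting the grant, which keeps the documentary trail intact, and consent is tracked per purpose, matching the requirement that it be "specific."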
What types of privacy data does the GDPR
protect?
• Basic identity information such as name, address and ID numbers
• Web data such as location, IP address, and RFID tags
• Health and genetic data
• Biometric data
• Racial or ethnic data
• Political opinions
Which companies does the
GDPR affect?
• A presence in an EU country.
• No presence in the EU, but it processes personal data of European
residents.
• More than 250 employees.
• Fewer than 250 employees but its data-processing impacts the rights
and freedoms of data subjects, is not occasional, or includes certain
types of sensitive personal data.
General Ethical Framework: VIRTUE
ETHICS
• Oldest theory of ethics
• Virtue ethics focuses not on rules for good or bad actions, but on the qualities of morally excellent persons
• A virtue is a moral characteristic that a person needs to live well
• Deal with wider questions—“How should I live?” and “What is the good life?” and “What are proper family and
social values?”
• Emphasize the role of character and virtue in moral philosophy
• How can virtue ethics help us to understand what our moral obligations are?
• Three Ways:
 By helping to see that we have a basic moral obligation to make a consistent and conscious effort to develop our
moral character for the better
 What virtue theories can tell us is where to look for standards of conduct to follow; virtue theories tell us to look for
them in our own societies, in those special persons who are exemplary human beings with qualities of character
(virtues) to which we should aspire
 Direct us toward the lifelong cultivation of practical wisdom or good moral judgment
UTILITARIAN ETHICS
• Principle of the ‘greatest good’ to determine what our moral
obligations are in any given situation.
• Determines right from wrong by focusing on outcomes.
• Advocates actions that foster happiness or pleasure and oppose
actions that cause unhappiness or harm.
• The ‘good’ in utilitarian ethics is measured in terms of happiness or
pleasure
DEONTOLOGICAL ETHICS
• Some acts are morally obligatory regardless of their consequences for
human welfare.
• Action itself is right or wrong under a series of rules and principles
• Consequences don’t matter
• Act in ways that we would be willing to have all other persons follow
Questions?
• Who do you consider a model of moral excellence that you see as an
example of how to live, and whose qualities of character you would like
to cultivate?
• What are three strengths of moral character (virtues) that you think are
particularly important for data practitioners to practice and cultivate in
order to be excellent models of data practice in their profession?
• In what ways do you think data practitioners can promote the ‘greater
good’ through their work, that is, increase human happiness?
• What are two cases you can think of in data practice in which a person or
persons were treated as a ‘mere means to an end’, that is, treated as
nothing more than a useful tool to achieve someone else’s goal?
Data Justice
• A key framework for engaging with the intersection of datafication and
society in a way that privileges an explicit concern with social justice.
• Seeks to examine data issues in the context of existing power
dynamics, ideology and social practices, rather than as technical
developments in the interactions between information systems and
users.
• Fairness in the way people are made visible, represented and treated
as a result of their production of digital data.
Timeline of Data Justice
Six Pillars of Data Justice Research &
Practice
 The pillar of power demonstrates the importance of understanding the levels at which power operates and how
power manifests in the collection and use of data in the world. The articulation of this pillar provides a basis from
which to question power at its sources and to raise critical awareness of its presence and influence.
 The pillar of equity addresses the need to confront the root causes of data injustices as well as to interrogate choices
about the acquisition and use of data, particularly where the goal or purpose is to target and intervene in the lives of
historically marginalized or vulnerable populations.
 The pillar of access illuminates how a lack of access to the benefits of data processing is a starting point for reflection
on the impacts and prospects of technological interventions. The beginning of any and all attempts to protect the
interests of the vulnerable through the mobilization of data innovation should be anchored in reflection on the
concrete, bottom-up circumstances of justice and the real world problems at the roots of lived injustice.
 The pillar of identity addresses the social character of data and problematizes its construction and categorization,
which is shaped by the sociocultural conditions and historical contexts from which it is derived.
 The pillar of participation promotes the democratization of data scientific research and data innovation practices and
the need to involve members of impacted communities, policymakers, practitioners, and developers together to
collaboratively articulate shared visions for the direction that data innovation agendas should take.
 The pillar of knowledge involves recognizing that diverse forms of knowledge and understanding can add valuable
insights to the aspirations, purposes, and justifications of data use—including on the local or context-specific impacts
of data-intensive innovation. Inclusion of diverse knowledges and ways of being can open unforeseen paths to societal
and biospheric benefits and maximize the value and utility of data use across society in ways which take account of the
needs, interests, and concerns of all affected communities.
Levels of Power in the collection and
use of data
Equity
• Equity should be considered before data collection.
• Pursuit of data equity should be to transform historically rooted
patterns of domination and entrenched power differentials.
• Combat any discriminatory forms of data collection and use that
centre on disadvantage and negative characterization.
• Pursue measurement justice and statistical equity.
Access
• Providing people tangible paths to data justice by addressing the root causes of social,
political, and economic injustice.
• Equitably open access to data through responsible data sharing.
• Equitably advance access to research and innovation capacity.
• Equitably advance access to the capabilities of individuals, communities, and the
biosphere to flourish.
• Promote the airing and sharing of data injustices across communities through data
witnessing.
• Promote the airing and sharing of data injustices across communities through
transparency
 Process Transparency
 Outcome Transparency
 Professional and Institutional Transparency
