Operationalizing Privacy by Design: A Guide to Implementing Strong Privacy Practices
December 2012
Acknowledgements
A considerable amount of work has gone into this paper and I am grateful
to my staff, without whom this monumental task could not have been completed.
Special thanks to Ken Anderson, Michelle Chibba, Fred Carter, Estella Cohen,
Jeff Kirke and Sandra Kahale for their specific contributions. I also wish to
acknowledge the many partners who have joined me as co-authors on the
numerous papers written over the last several years, generously lending
their expertise to broadening the PbD experience.
Information and Privacy Commissioner of Ontario, Canada
2 Bloor Street East, Suite 1400
Toronto, Ontario M4W 1A8, Canada
Telephone: 416-326-3333 or 1-800-387-0073
Fax: 416-325-9195
TTY (Teletypewriter): 416-325-7539
Website: www.ipc.on.ca
Privacy by Design: www.privacybydesign.ca
Table of Contents
Executive Summary
Introduction
The Fundamentals
  Privacy: A Practical Definition
  The 7 Foundational Principles of Privacy by Design
Implementation Guidance
Conclusions
Appendices
  Privacy by Design Papers Organized by Application Area
    CCTV/Surveillance Cameras in Mass Transit Systems
    Biometrics Used in Casinos and Gaming Facilities
    Smart Meters and the Smart Grid
    Mobile Devices & Communications
    Near Field Communications (NFC)
    RFIDs and Sensor Technologies
    Redesigning IP Geolocation Data
    Remote Home Health Care
    Big Data and Data Analytics
  Foundational PbD Papers
Executive Summary
It has been almost 20 years since I developed the concept of Privacy by Design (PbD). Reflecting
on the widespread acceptance it currently enjoys within the public and private sectors, as
well as its endorsement by the International Conference of Data Protection and
Privacy Commissioners, the U.S. Federal Trade Commission, the European Union and privacy
professionals, is particularly gratifying. While much has been accomplished, much work still
remains. The time has come to give deeper expression to PbD's 7 Foundational Principles.
Over the past several years, my Office has produced over 60 PbD papers with many well-known
subject matter experts – executives, risk managers, legal experts, designers, analysts,
software engineers, computer scientists, and applications developers – working in
telecommunications, health care, transportation, energy, retail, marketing, and law enforcement.
While some of our papers are “foundational” works, much of our PbD research is directly
related to one of nine key application areas (see the list following the Introduction).
The good news is that new insights are beginning to emerge – a set of common messages,
each associated with the 7 Foundational Principles, has become apparent. This is particularly
important because it further validates our initial principles, which are considerably broader
in scope and extend well beyond Fair Information Practices. It is these “messages” with which
this paper is primarily concerned.
In this paper, as in many others, I begin by framing privacy as an issue of control – the need to
maintain personal control over the collection, use and disclosure of one’s personally identifiable
information. It is a concept best reflected in the German right of “informational
self-determination” – the idea that the individual should be the one to determine the fate of his
or her personal information. Recognizing privacy as an exercise in control has always been
important, but it is critical today in an age characterized by far-reaching online social media
and ubiquitous computing.
Too often, issues of privacy and the protection of personal information are regarded as the domain
of large corporations – those with a Chief Privacy Officer or formal privacy infrastructure. This
is not the case. The Internet has proven to be such a tremendous leveller – today, relatively
small organizations may control disproportionately large volumes of PII. Every organization
bears a responsibility to understand its relationship with PII and strategize accordingly. I
believe that they would all benefit from embracing PbD. In this paper, I argue that it is not
the size or structure of the organization that matters; what matters is that someone is charged
with the responsibility of being accountable for the organization’s privacy protection. In a
large company, this may be the CPO, supported by application owners and developers; in a
smaller one, perhaps the founder is the one to be held accountable, relying on contracted IT
resources for support.
PbD – building privacy in early, robustly and systematically, across the business
ecosystem – yields meaningful benefits. Doing it right the first time has long been recognized as
a cost-saving strategy in multiple domains. Most importantly, however, the approach fosters an
environment where privacy harms are minimized or prevented entirely from happening in the
first place. Imagine the cost savings in avoiding data breaches and the duty of care that follows.
In summary, the activities and responsibilities for each of the 7 Foundational Principles include:
1. Proactive not Reactive; Preventative not Remedial – The focus is on the role played
by organizational leadership/senior management in the formation, execution and
measurement of an actionable privacy program. Research and case studies pertaining
to the role of Boards of Directors, the definition of an effective privacy policy, the
execution of a “PbD Privacy Impact Assessment” (a truly holistic approach to privacy
and privacy risk management) and a “Federated PIA,” as well as a variety of other
applications contribute to further implementation guidance.
2. Privacy as the Default Setting – Focusing on a new group within the organization, we
examine the critical role that business and application owners play in the development
of privacy requirements – requirements that will be incorporated into processes and
technologies developed by software engineers. The importance of minimizing the
collection of personal information, purpose specification, use limitation and barriers
to data linkages is reinforced. A variety of technologies – IP geolocation, anonymous
digital signage, de-identification and biometric encryption – highlight specific innovative
solutions to this challenge.
3. Privacy Embedded into Design – Continuing to focus on staff with responsibility for
the delivery of privacy, we consider the role of a Privacy Risk Assessment. Further,
we stress the importance of the “Laws of Identity” and the incorporation of privacy in
system development lifecycles and the variety of regulatory approaches. Case studies
focusing on how privacy is embedded into design include: IBM and their Big Data
Sensemaking Engine; San Diego Gas and Electric’s incorporation of privacy into their
Smart Pricing Program; and, the application of specific privacy design features for the
mobile communication ecosystem.
7. Respect for User Privacy – Keep it User-Centric – The privacy interests of the end-user,
customer or citizen are paramount. PbD demands that application and process developers
undertake a collection of activities to ensure that an individual’s privacy is protected
even if they take no explicit steps to protect it. Privacy defaults are key; clear notice
is equally important. Especially within complex systems (e.g. contemporary Social
Network Services), users should be provided with a wide range of privacy-empowering
options. We consider “Government 2.0” (an application of Web 2.0) and the critical
role that User Interface Design plays, as well as the emerging recognition of the value
of one’s personal information and the rise of the Personal Data Ecosystem. Finally, we
consider an intriguing and potentially powerful new form of Artificial Intelligence called
SmartData.
This is a lengthy paper, since it draws on the vast body of work previously published by
my office. But it is not a summary of those papers. It represents an in-depth review of that
work and a systematic consolidation and categorization of its seemingly disparate lessons.
I urge you to read this paper from beginning to end – the scope of the lessons and, more
importantly, their holistic interplay will hopefully entice you. Recognizing its length, however,
there are other reading strategies one may choose to adopt:
• Using the tables at the beginning of each principle, one can undertake a cursory survey
of the lessons and responsibilities associated with each of the 7 Principles.
• Choosing to focus on any single principle, one may dive deeply into our work by reviewing
the case studies summarized in each section.
• Those wishing to delve even more deeply may consult the references identified
and illustrated within. In the electronic version of this paper, clicking on a source
or cover illustration will link you back to the original work.
Finally, two rather lengthy appendices are presented. The first, a chronological presentation
of our PbD work, is useful in assessing the evolution of our approach to the topic. The second
groups the papers into either the “foundational” category or one of the nine key application areas.
Privacy by Design’s value as a privacy framework is now well recognized. There are many
organizations that have worked hard to achieve this gold standard for privacy and more
continue to implement PbD within their organization’s processes, applications and offerings.
Your work serves the cause of privacy – the protection of our personal information. For this,
you have my eternal thanks! Let us continue to work together to ensure that privacy grows
strong and prevails, well into the future.
Introduction
The momentum behind Privacy by Design (PbD) has been growing for the past several years.
Its global acceptance demonstrates that the power of the ‘build it in early’ approach to privacy
is truly without borders. Now, as this concept spreads, the question that remains is, “We
believe in PbD – but how do we do it?”
We are now at the stage where market and technology leaders, academics and regulators are
beginning to look at ways of translating the principles of PbD into technical and business
requirements, specifications, standards, best practices, and operational performance criteria.1
Having repeatedly said that PbD is not a theoretical construct, we must now demonstrate its
actual application on the ground.
This paper provides an overview of the partnerships and joint projects that the Office of the
Information & Privacy Commissioner of Ontario, Canada (IPC) has been engaged in over the
years to operationalize Privacy by Design – providing concrete, meaningful operational effect
to its principles. Informed by a broad collection of papers, it represents insights from a wide
range of sectors, including telecommunications, health care, transportation, and energy.
Further, it draws on the perspectives and experiences of executives, risk managers, lawyers
and analysts, as well as engineers, designers, computer scientists and application developers,
to name a few – all working to pursue privacy based on the principles of Privacy by Design.
By reflecting on the experiences of others, it is my hope that privacy leaders will recognize an
approach for their organizations to follow or be inspired to create one of their own.
I also hope that, like the organizations highlighted here, new players will come forward to
share their experiences, lessons learned, and accomplishments, arising through alignment of
their organizations and operations with the principles of Privacy by Design, so that we may
continue to build much-needed expertise, and grow best practices, for the benefit of all.
1 See Spiekermann, S. (July 2012). The Challenges of Privacy by Design. Communications of the ACM, 55(7), 38–40;
Gürses, S., Troncoso, C., & Diaz, C. (2011). Engineering Privacy by Design. Computers, Privacy & Data Protection; Kost,
M., Freytag, J. C., Kargl, F., & Kung, A. (August 22–26, 2011). Privacy Verification Using Ontologies. Paper presented at the
Sixth International Conference on Availability, Reliability and Security (ARES 2011), Vienna; Rost, M., & Bock, K. (2011).
Privacy by Design and the New Protection Goals (p. 9). EuroPriSe, European Privacy Seal.
Privacy by Design Application Areas
1. CCTV/Surveillance Cameras in Mass Transit Systems;
2. Biometrics Used in Casinos and Gaming Facilities;
3. Smart Meters and the Smart Grid;
4. Mobile Devices & Communications;
5. Near Field Communications (NFC);
6. RFIDs and Sensor Technologies;
7. Redesigning IP Geolocation Data;
8. Remote Home Health Care;
9. Big Data and Data Analytics.
The Fundamentals
From a practical perspective, privacy is not about secrecy or preventing organizations from
collecting needed personal information as part of their role in providing goods or services
to their customers. Privacy is about control – maintaining personal control over the collection,
use, and disclosure of one’s personally identifiable information. It is best expressed by the
German concept of “informational self-determination,” a term first used in the context of a
constitutional ruling related to personal information collected during Germany’s 1983 census.
When comparing the leading privacy practices and codes from around the world, there are
principles and values that remain timeless and relevant to the age of the Internet. One
noteworthy enhancement that needs to be recognized is the concept of data minimization. This
reflects the view that programs, information technologies and systems should operate with
non-identifiable interactions and transactions, as the default condition. Wherever possible,
identifiability, observability and linkability of personal information should be minimized. In
his book Code 2.0 (2006), U.S. academic Lawrence Lessig famously wrote, “Code is Law.” He
notes: “As the world is now, code writers are increasingly lawmakers. They determine what
the defaults of the Internet will be; whether privacy will be protected; the degree to which
anonymity will be allowed; the extent to which access will be guaranteed. They are the ones
who set its nature. Their decisions, now made in the interstices of how the Net is coded,
define what the Net is.” By extension, he demonstrated that we could, and should, engineer
cyberspace to reflect and protect our fundamental values.
There are two essential Fair Information Practices (FIPs) that best characterize the essence of
data privacy – “purpose specification” and “use limitation.” Simply put, purpose specification
speaks to clearly identifying why an organization needs to collect personal information. Use
limitation refers to only using the data collected for the primary purpose specified. If the data
collected will be used for other secondary purposes, then the individual must be informed
and allowed to consent to the additional uses of his or her personal data.
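Translated into code, these two practices often take the shape of a purpose check that every access to personal data must pass. The following is a minimal sketch under assumed names (ConsentedRecord, PurposeError and the use() helper are illustrative, not drawn from any of the papers discussed here):

```python
# A minimal sketch of purpose specification and use limitation in code.
# All names are illustrative assumptions, not an API from the IPC's papers.
from dataclasses import dataclass, field

@dataclass
class ConsentedRecord:
    data: dict
    consented_purposes: set = field(default_factory=set)  # fixed at collection time

class PurposeError(Exception):
    """Raised when data is requested for a purpose the individual never consented to."""

def use(record: ConsentedRecord, purpose: str) -> dict:
    # Use limitation: every access names its purpose, which must match a
    # purpose specified and consented to when the data was collected.
    if purpose not in record.consented_purposes:
        raise PurposeError(f"'{purpose}' is not a consented purpose")
    return record.data

# Purpose specification: the reason for collection is recorded up front.
record = ConsentedRecord({"email": "alice@example.com"}, {"billing"})
use(record, "billing")        # permitted: the primary, specified purpose
# use(record, "marketing")    # raises PurposeError: a secondary use requires new consent
```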
These perspectives are fundamental to Privacy by Design (PbD) and inform its 7 Foundational
Principles.
Privacy by Design – embedding privacy into information technologies, business practices, and
networked infrastructures, as a core functionality, right from the outset – means building
in privacy right up front – intentionally, with forethought. PbD may thus be defined as an
engineering and strategic management approach that commits to selectively and sustainably
minimize information systems’ privacy risks through technical and governance controls. At
the same time, however, the Privacy by Design approach provides a framework to address
the ever-growing and systemic effects of ICTs and large-scale networked data systems with
enhancements to traditional FIPs. These are: 1) acting proactively; 2) making privacy the
default condition; 3) embedding privacy directly into design; and 4) taking a doubly-enabling
positive-sum (not zero-sum) approach in the face of multiple, competing objectives.
What does it mean to practice these principles? Operationalization is essential to PbD. It
extends the principles to a set of actionable guidelines that application and program owners
can communicate to those responsible for their implementation. Think of the 7 Principles
as a multi-pronged approach to achieving the highest standard of privacy protection, in an
ecosystem requiring broad participation.
The approach will vary depending upon the organization, the technology and other
variables. While there is no single way to implement, operationalize, or otherwise roll out
a PbD-based system, taking a holistic approach is key. The process necessarily challenges
programmers and engineers to think creatively about all of a system’s requirements, and
similarly challenges organizational leaders to innovate, test, and discover what works best
in their particular environment.
What is certain is that when these principles are applied early on, robustly, systematically,
and across the business ecosystem, they help to foster an environment where privacy harms
are minimized or prevented from happening in the first place. They also stimulate:
• clear privacy goal-setting;
• systematic, verifiable methodologies;
• practical solutions and demonstrable results; and
• vision, creativity, and innovation.
Examining the experiences of leading organizations in the application of the principles of
Privacy by Design is profoundly instructive, suggesting paths forward for others interested
in taking a comprehensive approach to responsible information management practices.
“The day started out with the Information and Privacy Commissioner of Ontario, Canada
– Dr. Ann Cavoukian – giving a presentation via video to the group on Privacy by Design.
… Now I have heard of Dr. Cavoukian and the PbD movement. But I had never been exposed to
any details. The details were amazing and I like the 7 Foundational Principles. … These are
sound principles that make a lot of sense.”
Craig Burton, KuppingerCole blog on The 2012 International OASIS Cloud Symposium,
October 17, 2012
Implementation Guidance
Organizations range from small (e.g. a sole proprietorship) and medium-sized to the very large
(e.g. multinational corporations and governments) with structures that may be entrepreneurial,
hierarchical, functional, divisional or matrix (to name a few). Regardless of their size and
structure, however, any organization that encounters personal information must effectively
manage and protect it.
Everyone within the organization has a role to play with respect to the protection of Personally
Identifiable Information (PII). Yet this fails to bring us closer to appreciating precisely who is
responsible for what. To begin that discussion, we propose the following model:
The integration of privacy into the development of customer or citizen-facing offerings is based
on a set of privacy requirements, which, themselves, are reflected in an organization’s privacy
policies. The model recognizes that one or more individuals may perform some or all of the
roles identified. What is important is not that an organization explicitly identifies an individual
responsible for each role; rather, that each of the tasks is undertaken and accountably executed.
For example, in a very small business, the founder may play the role of “Board/CEO” and
“Chief Privacy Officer,” while a colleague or subordinate may act as the “Application Owner”
and “Programmer.”
Privacy policies support a culture of privacy. Intended to apply across the organization,
responsibility for their development and enforcement naturally falls to a senior member of the
leadership team – ideally a CPO. A properly defined set of privacy policies forms a backdrop
against which application owners and product developers develop specific sets of privacy
requirements to be embedded into their offerings. A CPO’s executional responsibility is
associated with the development of practices to ensure that privacy is consistently embedded
in applications and processes, and to audit them periodically to ensure that this is the case.
Privacy requirements are at the core of PbD execution. Informed by an organization’s privacy
policy, the 7 Foundational Principles of PbD and assisted by a variety of privacy-supportive
processes and practices, those deemed to “own” the customer-facing offerings bear primary
executional responsibility to ensure the development of a rich set of privacy requirements, as
well as their subsequent integration in the development process – from the outset. Further,
through the offering’s development lifecycle, working with the developers, they must ensure
that the requirements are satisfied and that deficiencies are identified and addressed. Once
the development process is complete, its approval affirms that each of the requirements has
been fully satisfied. Seeking guidance or assistance, should it be required, and updating the
CPO regarding the completed offering’s privacy status, rounds out the application owners’
privacy responsibilities.
Based on their understanding of the offering’s full suite of requirements, developers would
then create the actual offering. They will most likely need to innovate to satisfy PbD’s central
promise of “Full Functionality – Positive-Sum, not Zero-Sum.” Over time, however, as
privacy requirements become more commonplace, the task of embedding privacy will become
simplified and accelerated by the development of “privacy code libraries” – collections of code
that satisfy typical privacy requirements – similar in nature to those which currently exist
for other purposes.
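To make the notion of a “privacy code library” concrete, here is a minimal sketch of the kind of reusable routines such a library might collect. The specific helpers below (truncate_ip, mask_email, pseudonymize) are illustrative assumptions, not a published library:

```python
# Sketch of routines that might live in a reusable "privacy code library":
# small, well-tested helpers satisfying recurring privacy requirements.
import hashlib

def truncate_ip(ip: str) -> str:
    """Coarsen an IPv4 address by zeroing the host octet (data minimization)."""
    octets = ip.split(".")
    return ".".join(octets[:3] + ["0"])

def mask_email(email: str) -> str:
    """Mask the local part of an email address before display or logging."""
    local, _, domain = email.partition("@")
    return (local[0] + "***@" + domain) if local else email

def pseudonymize(identifier: str, salt: bytes) -> str:
    """Replace a direct identifier with a salted, one-way token."""
    return hashlib.sha256(salt + identifier.encode()).hexdigest()[:16]

print(truncate_ip("203.0.113.87"))        # 203.0.113.0
print(mask_email("alice@example.com"))    # a***@example.com
print(pseudonymize("alice", b"per-application-salt"))
```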
The next section begins the process of systematizing and summarizing the lessons learned
implementing the 7 Foundational Principles that have formed the cornerstone of our collection
of PbD papers.
In this section, a consistent approach has been employed to assist those who seek to
implement the principles of PbD. Each principle is identified and defined with key dimensions
of the principle highlighted. A chart summarizes the “Actions” to implement the spirit of the
principle that are most closely associated with the principle and who within the organization
is accountable for their execution. Each principle is then informed by the insights gained by
working with organizations that have implemented PbD-based privacy programs, as well as
lessons learned from our long history of research in this area.
The 7 Foundational Principles
of Privacy by Design
1. Proactive not Reactive; Preventative not Remedial
The Privacy by Design (PbD) approach is characterized by proactive rather than reactive measures. It anticipates
and prevents privacy invasive events before they happen. PbD does not wait for privacy risks to materialize,
nor does it offer remedies for resolving privacy infractions once they have occurred – it aims to prevent them
from occurring. In short, Privacy by Design comes before-the-fact, not after.
Principle 1
Actions (Responsibility: Leadership/Senior Management)
1. Affirm senior leadership commitment to a strong, proactive privacy program.
2. Ensure that concrete actions, not just policies, reflect a commitment to privacy. Monitor through a system of regularly reviewed metrics.
Organizations must begin with an explicit recognition of the value and benefits of proactively
adopting strong privacy practices, early and consistently. Addressing privacy at the
outset prevents privacy issues from arising in the first place. This is the essence of Privacy by
Design and a dimension where it exceeds traditional compliance frameworks. The alternative
is privacy by chance, or worse, privacy by disaster (a term coined by Dr. Kai Rannenberg)
– harried efforts to limit or remediate the damage that has already been done. In our view,
that is too little, too late, and represents how things were done in the past.

“Intel views Privacy by Design as a necessary component of our accountability mechanisms
that we implement in our product and service development processes.”
David A. Hoffman, Director of Security Policy and Global Privacy Officer, Intel Corporation

With PbD, clear commitments must be made and resources allocated to back them up. This
kind of executive-led approach fosters the development of a culture of privacy across the
entire organization. Such a culture enables sustained collective action by providing staff
with a similarity of approach, outlook and priorities. It is what leads privacy to be woven
into the fabric of the day-to-day operations of an organization, at all levels, and supports
the ultimate success of an organization’s privacy programs.
Accountability must be ensured, with clearly identified “business owners” who take on lead
responsibility. In this sense, a “business owner” is understood to be an individual who has
been explicitly identified as being accountable for the successful execution of one or more
privacy-related tasks. These may be executives, organizational privacy leaders, business
process owners, or project leaders. The Chief Privacy Officer, for example, is the “owner” of
the organization’s privacy policy. Similarly, a Brand or Product Manager is the “owner” of a
product or service and is accountable for its management of the PII with which it comes in
contact. The rise of the Chief Privacy Officer (CPO) role in organizations is a testament to the
strategic importance of good information management.
Guidance for Boards of Directors: What You Don’t Know Can Hurt You
Being proactive means that corporate directors, faced with a wide
array of responsibilities arising from their board membership,
must make oversight of the organization’s privacy policies and
procedures an integral and necessary component of effective
board service. This can be achieved through the following actions:
c) Directors should ensure that privacy compliance is a part of senior management
performance evaluation and compensation;
e) Directors should ensure that they ask senior management the right questions about
privacy practices in their organization.
Source: Privacy and Boards of Directors: What You Don’t Know Can Hurt You, November 2003
(Revised July 2007).
d) Develop and conduct privacy education and awareness training programs to ensure
that all employees understand the policies/practices required, as well as the obligations
they impose;
e) Designate a central “go-to” person for privacy-related queries within the organization;
f) Verify both employee and organizational execution of privacy policies and operational
processes and procedures; and
g) Proactively prepare for a potential privacy breach by establishing a data breach protocol
to effectively manage it.
Source: A Policy is Not Enough: It Must be Reflected in Concrete Practices, September 2012.
Of course, the optimal time to be proactive is when an information technology or a networked
infrastructure is new and emerging. By building privacy in from the outset – ideally as early
as the conceptual stage – it becomes possible to foster confidence and trust in the technology
or infrastructure as being privacy-protective, and, ideally, to avoid costly future retrofits.
All PIAs should have a modular nature, since most policies, governance frameworks and systems
are neither the purview nor the expertise of a single person. For that reason, the PIA should
have a coordinator or point person within the organization, often the Chief Privacy Officer.
The CPO or other privacy lead should assemble the organizational team required to review and
answer the PIA questions. In a corporate setting, that team would include representatives from
IT, customer service, security, marketing, risk management, and relevant lines of business.
This approach serves to provide greater meaning for participants not directly responsible for
privacy, and acts as a building block of the organization’s information governance and risk
management program. Optimally, the various owners/operators of the systems and other
framework elements will have been consulted in the development of the PIA, and the PIA
process will yield benefits to them, as well.
Conceiving of the PIA in this way helps those disciplines not specifically focused on privacy
to better understand the value of the review, its relevance to their job function, and the role
it plays in adding value to the organization.
By conducting this type of assessment proactively and early on, the privacy impacts of
the resulting technology, operation or information architecture, and their uses, should be
demonstrably minimized, and not easily degraded through use, misconfiguration or error
during the implementation.
a) The framework should be applied continuously at all stages (conceptual, physical and
logical) of the design, development and implementation of the information technology,
business process, physical space and networked infrastructure project;
b) Organizations should take into consideration the privacy
expectations of individuals regarding their information;
b) Demonstrate that privacy policies, as defined by the members of the Federation, will
be met;
Source: Joseph H. Alhadeff (co-author on behalf of the Liberty Alliance Project) – The New
Federated Privacy Impact Assessment (F-PIA) Building Privacy and Trust-enabled Federation,
January 2009.
Source: Marilyn Prosch (co-author, ASU Privacy by Design Research Lab) – The Roadmap for
Privacy by Design in Mobile Communications: A Practical Tool for Developers, Service Providers,
and Users, December 2010.
Guidance on Practices that Support a Culture of Continuous Improvement
in Privacy Protection: Organizational Tools and Frameworks
The following papers provide examples of approaches to
organizational tools and frameworks that have been developed
to support proactive privacy practices:
d) Nymity’s PbD Risk and Control Checklists support its
Privacy Risk Optimization Process (PROP) that is based on
the International Organization for Standardization (ISO)
concept that risk can be both positive and negative. Risk
Optimization is a process whereby organizations strive to
maximize positive risks and mitigate negative ones. The
PROP uses these concepts to implement privacy proactively
into operational policies and procedures.
“I want to congratulate you on the incredible achievement of what I would call the Privacy
by Design movement. Based on the OECD and International Data Protection and Privacy
Commissioners’ conferences in Israel it is clear that industry, government and NGOs have
all embraced PbD everywhere in the world. I say this based on both the conversations I had
with individuals and the sessions I attended. People understand and seem committed.”
Terry McQuay, President, Nymity Inc.
Source: Co-authored with IESO – Building Privacy into Ontario’s Smart Meter Data Management
System: A Control Framework, May 2012.
Principle 2
Privacy as the Default Setting
Actions (Responsibility: Software Engineers & Developers)
1. Adopt as narrow and specific a purpose(s) for data collection as possible – begin with no collection of personally identifiable information – data minimization.
The single most effective yet most challenging method of preserving privacy is to ensure
that the default settings – the settings that apply when the user is not required to take
any action – are as privacy-protective as possible. In operationalizing this principle, one might
look to the discipline of privacy engineering being examined by a number of academics (e.g.
S. Gürses, C. Troncoso and C. Diaz, 2011), on which there will be reliance when dealing with
back-end systems. Privacy management as a distinct discipline is becoming more standardized
and professionalized, with a growing demand for skilled privacy engineers and architects.
We want to encourage thinking beyond the default settings associated with preferences that
users can manually control, and to consider the overall system defaults.
The starting point for designing information technologies and systems must always be maximally
privacy-enhancing, beginning with NO collection of personally identifying information, unless
and until a specific and compelling purpose is defined. If this is the case, organizations should
seek to adopt as narrow and specific a purpose(s) for data collection as possible. “Specified
purposes should be clear, limited and relevant to the circumstances.”
This approach, referred to as “data minimization,” must be the first line of defence – non-
collection, non-retention and non-use of personal data. Similarly, the collection, use and
disclosure of aggregated or de-identified personal information raise few, if any, privacy issues.
Quite simply, personal data that is not collected, retained, or disclosed needs no securing,
management, or accounting – no duty of care arises, nor possibility of harm. Likewise, personal
data that does not exist in a database cannot be accessed, altered, copied, appended, shared,
lost, hacked, or otherwise used for secondary purposes by unauthorized third parties. All too
often, we apply the same requirements from the paper world to the digital world when in fact,
online systems require less data precisely because of the mathematical and computational
capabilities of technologies.
Where personal data must be collected for clearly specified purposes, the next step in
operationalizing this principle is to limit the uses and retention of that information, as much
as possible. The principles of purpose specification and use limitation, contained in FIPs,
best illustrate this point.
There are many ways in which this may be accomplished. One method is to carry out operations
with privacy implications (i.e. those that use personal information) client-side – that is, entirely
under the control of users and their devices. Obviously, the more tamper-proof, secure, and
user-controlled the device or software, the more trusted it will be to carry out its functions
reliably. Dividing data, functions, and roles among different entities is a proven method of
ensuring privacy. For example, this strategy is the basis for using proxy servers to obscure IP
addresses and to defeat online tracking and profiling. In practice, a combination of organizational
and technical measures will be necessary to achieve this goal of default privacy.
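As a concrete sketch of privacy as the default, consider a settings object whose zero-action state collects nothing and retains nothing, so that any data flow requires a deliberate opt-in. The field names below are assumptions for illustration:

```python
# A minimal sketch of privacy-protective defaults: the state that applies when
# the user takes no action collects, shares and retains nothing. Field names
# are illustrative assumptions, not drawn from any specific product.
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    collect_location: bool = False        # no collection unless explicitly enabled
    collect_usage_analytics: bool = False
    share_with_third_parties: bool = False
    retention_days: int = 0               # retain nothing by default

def record_location(settings: PrivacySettings, fix):
    # Client-side gate: the reading never leaves the device unless the user opted in.
    return fix if settings.collect_location else None

settings = PrivacySettings()              # zero-action state: maximally private
assert record_location(settings, (43.65, -79.38)) is None
settings.collect_location = True          # a deliberate, user-initiated opt-in
assert record_location(settings, (43.65, -79.38)) is not None
```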
The default principle is illustrated in the following examples:
Source: Michael Ho, Co-author, Bering Media – Redesigning IP Geolocation: Privacy by Design
and Online Targeted Advertising, October 2010.
Source: With support from Intel – White Paper: Anonymous Video Analytics (AVA) technology
and privacy, April 2011.
3. De-Identification of Health Data: De-identified data is information that has had its
identifiers removed, but has not been combined or aggregated with other individuals’
data. It is a common approach to privacy protection and as a general rule can help protect
personal information in the event it is lost or stolen, making it more difficult to exploit for
nefarious purposes. Re-identification is extremely difficult in practice when appropriate
de-identification techniques are used. While de-identification remains an important tool,
the first approach should be data minimization in which data aggregation ensures that
individual data is not disclosed in the first place. Advanced de-identification methods
allow data custodians to exploit data without risking identity. Dr.
Khaled El Emam, Canada Research Chair in Electronic Health
Information, CHEO Research Institute and University of Ottawa,
has developed methodologies and de-identification algorithms to
manage risks related to re-identification, data theft and misuse.
Sources: Khaled El Emam, Ph.D., Co-author (Canada Research
Chair in Electronic Health Information, CHEO Research Institute
and University of Ottawa) – Dispelling the Myths Surrounding De-
identification: Anonymization Remains a Strong Tool for Protecting
Privacy, June 2011; A Positive-Sum Paradigm in Action in the
Health Sector, March 2010.
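As a simple illustration of the kind of transformation involved – far simpler than the risk-based methodologies cited above – the sketch below drops direct identifiers and generalizes quasi-identifiers. The field choices and generalization rules are assumptions for illustration only:

```python
# A toy illustration of de-identification: remove direct identifiers and
# generalize quasi-identifiers. Real health-data de-identification, such as
# the risk-based methods cited above, is far more rigorous than this sketch.
DIRECT_IDENTIFIERS = {"name", "health_card_no", "email"}

def deidentify(record: dict) -> dict:
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "age" in out:                      # generalize age to a ten-year band
        low = out["age"] // 10 * 10
        out["age"] = f"{low}-{low + 9}"
    if "postal_code" in out:              # keep only the forward sortation area
        out["postal_code"] = out["postal_code"][:3]
    return out

patient = {"name": "Alice Smith", "health_card_no": "1234-567-890",
           "age": 47, "postal_code": "M4W1A8", "diagnosis": "J45"}
print(deidentify(patient))
# {'age': '40-49', 'postal_code': 'M4W', 'diagnosis': 'J45'}
```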
5. RFID On-Off Transmission Control: With a vicinity read
RFID chip embedded inside, the Enhanced Driver License
(EDL) was intended to communicate with readers at U.S.
customs and border crossings in order to enhance identity
checks. This RFID remained “on” by default, posing significant
privacy risks to the bearers. A Privacy by Design approach
argued that the default transmission setting for these cards
should be “off” until users chose to turn it on.
Sources: Transformative Technologies Deliver Both Security and Privacy: Think Positive-Sum
not Zero-Sum, March 2009; Video: A Word About RFIDs and your Privacy in the Retail Sector,
March 2006.
Principle 3
Privacy Embedded into Design
Actions (Responsibility: Application & Program Owners)
1. Make a Privacy Risk Assessment an integral part of the design stage of any initiative, e.g. when designing the technical architecture of a system, pay particular attention to potential unintended uses of the personal information.
Operationalizing this Principle requires approaching design and development processes
throughout the organization in holistic, integrative and creative ways. Just as PbD
represents a shift in the way that organizations think about privacy – moving away from a
reactive model to a proactive one – enshrining PbD in regulatory instruments, voluntary codes
and best practices requires a shift in how law and policy-makers approach rule making. What
is invited is the development of innovative approaches to promoting and enshrining privacy
in various instruments.
What is essential is that all interests and objectives, including privacy, be clearly documented,
desired functions articulated, metrics agreed upon and applied, and trade-offs rejected as being
unnecessary, in favour of finding a solution that enables multi-functionality (see Principle 4:
Full Functionality – Positive Sum, not Zero-Sum).
At the same time, information security system standards and frameworks are being applied
today by enterprises, in greater numbers and with greater rigour, and Enterprise Architecture
design has burgeoned as a discipline, fuelled in part by regulatory and competitive pressures.
These information management efforts are consistent with, and can inform, Principle 3:
Privacy Embedded into Design.
Most importantly, even in scenarios where the target is an IT system or application, operationalizing
PbD cannot be viewed as just an IT project. Privacy expertise must be available and
engaged through all phases of the workflow, and bring with it a multi-faceted understanding
of privacy issues and requirements, and an appreciation of consumer/client expectations.
Depending on the nature of the project, there may be significant need for the competencies
of functional experts, risk managers, process experts, and other specialists.
I called 2011 the “Year of the Engineer.” In an effort to reach a wider spectrum of expert
participants, I gave talks that year almost exclusively to software engineers, computer
scientists, and technology developers from around the globe. Together, we started a dialogue
about translating the 7 Foundational Principles of PbD into project requirements, procurement
specifications, and positive-sum operational results. I was truly heartened by the warm
response I received from the engineers I met with!
“Privacy by Design is a concept promoted by Ann Cavoukian, Ph.D., Information & Privacy
Commissioner of Ontario, Canada, which aims to promote the idea of systems and processes
built with privacy in mind, rather than retrofitted afterwards. I encourage all readers to
browse her site which is quite informative, and gives you perhaps a “bigger picture” view
than IT alone.”
Simon Hunt, Vice-President & Chief Technology Officer, McAfee Data Protection
Here are some illustrative examples of this work and the contributions to embedding PbD
into engineering design:
2. Embedding Privacy into Big Data Methods: A responsible “Big Data analytic
sensemaking” engine – Big Data is here and organizations want to leverage data analytics
to maximize this growing resource. While organizations have practical incentives to
make the most out of Big Data, we need to ensure that privacy is embedded into these
systems. Jeff Jonas shows us how embedding PbD is possible with his sensemaking
systems technology. We believe this design will guide others in the process of creating
their own next-generation analytics. This not only demonstrates that privacy can be
embedded into data analytics technologies but it can be done in a positive-sum manner.
The sensemaking technology has been designed to make sense of new observations
as they happen, fast enough to take action on them while the transaction is still
happening. While its analytic methods, its capacity for Big Data
and its speed are game-changing, from a privacy perspective it
has been designed from the ground up with privacy protections
in mind: i) full attribution, knowing the source of the data as
well as data tethering (any revisions of the data) are turned
on by default; ii) the analytics can be done on anonymized
data or what we call data minimization; iii) there is a tamper-
resistant audit logging feature that applies even to the database
administrator which enhances transparency and accountability;
iv) the false negative favouring methods reduce the number of
incorrect identifications that may have a significant impact on
civil liberties; v) self-correcting false positives advance greater
accuracy in identification; and vi) the inclusion of information
transfer accounting helps track secondary uses of the data. The
dynamic pace of technological innovation requires us to embed privacy into design in
a proactive manner – systems designers should be encouraged to practice responsible
innovation in the field of advanced analytics.
Source: Jeff Jonas, Co-author, (IBM) – Privacy by Design in the Age of Big Data, June 2012.
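One of the features above – tamper-resistant audit logging that applies even to the database administrator – can be approximated with a hash chain, in which each log entry commits to its predecessor so that any after-the-fact edit is detectable. The sketch below is an illustrative approximation, not the engine’s actual mechanism:

```python
# A hash-chain sketch of tamper-evident audit logging, in the spirit of
# feature (iii) above. Illustrative only; not the sensemaking engine's code.
import hashlib, json, time

def _digest(prev_hash: str, entry: dict) -> str:
    payload = prev_hash + json.dumps(entry, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

class AuditLog:
    def __init__(self):
        self.entries = []                          # (entry, chained_hash) pairs

    def append(self, actor: str, action: str) -> None:
        entry = {"actor": actor, "action": action, "ts": time.time()}
        prev = self.entries[-1][1] if self.entries else "genesis"
        self.entries.append((entry, _digest(prev, entry)))

    def verify(self) -> bool:
        # Recompute the chain; an edited or deleted entry breaks every later hash.
        prev = "genesis"
        for entry, stored in self.entries:
            if _digest(prev, entry) != stored:
                return False
            prev = stored
        return True

log = AuditLog()
log.append("dba", "queried table PII_RECORDS")     # even the DBA leaves a trace
assert log.verify()
log.entries[0][0]["actor"] = "someone_else"        # a tampering attempt...
assert not log.verify()                            # ...is detected
```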
3. Embedding Privacy into Remote Surveillance Systems:
Ethical Technology in the Homes of Seniors (ETHOS) – A
project with Privacy by Design – minimize data; make
control meaningful; make control usable; and empower
– don’t overwhelm. This National Science Foundation-
funded, Indiana University-Bloomington interdisciplinary
team created a digital toolkit that enabled elders to maintain
their privacy, while taking full advantage of home-based
computing for their health and personal safety. Elders systematically underestimate
their electronic privacy risk. This project examined the role of information technology
in the homes of elders with an emphasis on design and evaluation for privacy. The
ETHOS team is creating tools that will help elders make appropriate decisions about
home-based computing and guide designers in creating privacy-respecting technologies.
Source: L. Jean Camp, Ph.D. – Respect by Design. Paper presented at “Privacy by Design: The
Gold Standard,” Toronto, Ontario, January 2010.
Source: Kai Rannenberg, Ph.D. – Privacy by Design in Mobile Applications and Location Based
Services. Paper presented at “Privacy by Design: The Gold Standard,” Toronto, Ontario, January 2010.
privacy while supporting access to individual-level data for research in the public interest.
It explores challenges presented by legislation, stewardship, and public perception and
demonstrates how PopData achieves both operational efficiencies and due diligence.
Source: Caitlin Pencarrick Hertzman, Nancy Meagher, Kimberlyn M McGrail – Privacy by Design
at Population Data BC: a case study describing the technical, administrative, and physical
controls for privacy-sensitive secondary use of personal information for research in the public
interest, August 2012.
Source: Ann Cavoukian & Klaus Kursawe – Implementing Privacy by Design: The Smart
Meter Case. Paper presented at “the IEEE International Conference on Smart Grid Engineering
(SGE’12),” Oshawa, Ontario (to be published).
Source: Caroline Winn, Co-author, (San Diego Gas & Electric) – Applying Privacy by Design
Best Practices to SDG&E’s Smart Pricing Program, March 2012.
8. Embedding Privacy into Mobile Technologies and
Ecosystems: These are examples of privacy design features
specific to the mobile industry:
• Design reporting features that allow the user to be notified of how data is being
collected, by what applications, and whether any exceptions to his/her privacy
preferences have occurred;
• Provide a simple, easy to understand user interface for such controls;
• Minimize applications’ access to device data; and
• Where practical, define privacy requirements and security standards for services
provided on the platform.
c) Network Providers:
• Integrate privacy into the development cycle, and practice data minimization techniques;
• Use privacy-protective default settings;
• Ensure end-to-end protection of user data;
• Maintain user awareness, and control of, data collection and use; and
• Design applications with privacy in mind.
Source: Marilyn Prosch, Co-author, (ASU Privacy by Design Research Lab) – The Roadmap for
Privacy by Design in Mobile Communications: A Practical Tool for Developers, Service Providers,
and Users, December 2010.
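The reporting and preference features above can be sketched as a simple gate that every collection request must pass, with denials recorded so they can be surfaced to the user. The class and method names are illustrative assumptions:

```python
# A sketch of the mobile privacy-design features above: collection requests are
# checked against the user's preferences, and exceptions are recorded so a
# reporting UI can notify the user. Names are illustrative assumptions.
class PrivacyGate:
    def __init__(self, preferences: dict):
        self.preferences = preferences     # e.g. {"location": True, "contacts": False}
        self.exceptions = []               # events to surface in a reporting feature

    def request(self, app: str, data_type: str) -> bool:
        allowed = self.preferences.get(data_type, False)   # deny by default
        if not allowed:
            self.exceptions.append(f"{app} was denied access to {data_type}")
        return allowed

gate = PrivacyGate({"location": True})
gate.request("maps_app", "location")       # permitted by an explicit preference
gate.request("flashlight_app", "contacts") # denied by default, and recorded
print(gate.exceptions)                     # what the user would be notified of
```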
9. Embedding Privacy into Wireless Communications
Ecosystems: These are examples of the design requirements
set out for NFC technology deployment.
a) NFC Device Manufacturer: Consider the holistic,
platform-wide solution being provided and the privacy
design aspects that each component element in their
design adds to the overall solution, to avoid the false
assumption that privacy will be handled by some other
component within their solution (e.g., NFC data transfer
application assuming Bluetooth stack on the mobile
device will inform the user of details of the data to be
received).
b) NFC Application Developer: When creating
applications, especially within the peer-to-peer category, NFC application developers
should also be cautious about design elements that create a persistent linkage of
the NFC usage to the user or individual mobile device (e.g., MSISDN, IMEI, game
player identifier “XYZ,” etc.). The collection of personal information such as a unique
device identifier should be featured in the notification provided to users.
Source: Co-authored with Nokia – Mobile Near Field Communications (NFC) “Tap ‘n Go” – Keep
it Secure & Private, November 2011.
10. Embedding Privacy into Governance and Oversight Mechanisms: Embedding PbD
Principles into Regulatory Frameworks. This illustrates how addressing privacy proactively
may be embedded into the design of regulatory frameworks. PbD’s flexible, innovation-
driven approach to achieving privacy can help to encourage organizations to “internalize
the goal of privacy protection and to come up with ways to achieve it. This approach could
be advanced, for example, as part of a second generation regulatory
framework. In the complex, fast-moving information economy, this
strategy could be an effective way to enhance privacy protection.”
Under the influence of such a “second generation” approach,
incorporating the Principles of Privacy by Design, companies can
be encouraged to go beyond mere regulatory compliance with
notice, choice, access, security and enforcement requirements.
Instead, they can be empowered to design their own responsive
approaches to risk management and privacy-related innovation,
within the context of a policy or regulatory framework.
Source: With Foreword by Pamela Jones Harbour – Privacy by
Design in Law, Policy and Practice: A White Paper for Regulators,
Decision-makers and Policy-makers, August 2011.
Principle 4
Full Functionality – Positive-Sum, not Zero-Sum
Operational Guidance: These actions seek to accommodate legitimate interests and objectives in a positive-sum, ‘win-win’ manner, not through a zero-sum (win/lose) approach, where unnecessary trade-offs to privacy are made. Avoid the pretense of false dichotomies, such as privacy vs. security – demonstrate that it is possible to have both.
Actions (Responsibility: Leaders/Senior Management)
1. Acknowledge that multiple, legitimate business interests must coexist.
Perhaps nowhere has this outdated, yet mainstream, way of thinking been more apparent
than in the area of public safety/security. This is where we see the classic zero-sum paradigm
writ large, with the view that the more we have of one interest (public security), the less we
can have of another (individual privacy). In this zero-sum framework, privacy can never win
out – the other interest advances, always at the expense of privacy.
Similarly, in health care, tensions exist between the need to have vital health-care information
readily available for treatment and care by health-care professionals and the need, at the
same time, to guard it carefully as highly sensitive data. Respecting people’s privacy should never present
an impediment to the delivery of health-care services. Given the sensitive nature of health-
related information, these highly beneficial systems will only succeed if they are built with
privacy in mind – thereby delivering a positive-sum, doubly-enabling outcome.
Although each of the papers in the IPC’s Privacy by Design repertoire demonstrates the positive-
sum principle, a selected few are used here to illustrate how this
principle is operationalized. By adopting a positive-sum paradigm and applying a privacy-
enhancing technology to a surveillance technology, you develop what I call “transformative
technologies.” Among other things, transformative technologies can literally transform
technologies normally associated with surveillance into ones that are no longer privacy-
invasive, serving to minimize the unnecessary collection, use and disclosure of personal data,
and promoting public confidence and trust in data governance structures.
Source: Privacy and Video Surveillance in Mass Transit Systems: A Special Investigation Report
– Privacy Investigation Report MC07-68, March 2008.
Source: Richard C. Alvarez, Co-author (Canada Health Infoway) – Embedding Privacy into the
Design of EHRs to Enable Multiple Functionalities – Win/Win, March 2012.
7. Protecting Smart Meter Consumer Energy Usage Data and Achieving Energy
Efficiency, Conservation, Reliability and Sustainability Objectives: Armed with an
understanding of where privacy issues are likely to arise in the Smart Grid, regulators
can help utilities understand privacy through the lens of a positive-sum, rather than a
zero-sum, paradigm. When operating in this paradigm, utilities may believe that privacy
interferes with other operational goals of the Smart Grid. Looking at privacy through the
lens of a positive-sum paradigm, it becomes clear that a win-win situation is possible. The
Smart Grid can achieve all of its objectives AND provide strong privacy for consumers.
Indeed, designing privacy protections into the Smart Grid need not weaken security or
functionality – it can, in fact, enhance the overall design of the system.
a) Understand – Are Smart Grid projects being planned in your jurisdiction? Which
utility companies are involved? Who are the market leaders and what is their vision?
Familiarize your office with the essentials.
b) Engage – Find the key people involved with the Smart Grid in your local utilities.
Determine their level of understanding, educate them and open a dialogue
about privacy.
Source: Shaping Privacy on the Smart Grid – You Can Make a Difference: A Roadmap for Data
Protection Commissioners and Privacy Regulators, October 2010.
Principle 5
End-to-End Security – Full Lifecycle Protection
Operational Guidance: Security is the key to privacy. These actions ensure cradle-to-grave, lifecycle management of information, end-to-end, so that at the conclusion of the process, all data are securely destroyed, in a timely fashion.
Actions (Responsibility: Software Engineers & Developers; Application & Program Owners; Line of Business & Process Owners)
1. Employ encryption by default to mitigate the security concerns associated with the loss, theft or disposal of electronic devices such as laptops, tablets, smartphones, USB memory keys and other external media. The default state of data, if breached, must be “unreadable.”
2. Deploy encryption correctly and carefully integrate it into devices and workflows in an automatic and seamless manner.
3. Ensure the secure destruction and disposal of personal information at the end of its lifecycle.
End-to-end security seeks the highest standard of data security possible. Organizations must
assume responsibility for the security of personal information (including confidentiality,
integrity and availability) throughout its entire lifecycle (at rest, in transit, while in use), consistent
with the international standards that have been developed by recognized standards development
organizations. Data security is essential to information privacy but does not equal privacy.
Information security may be compared to a chain – it is only as strong as its weakest link.
f) Encryption by default – Once an encryption system has been installed on a mobile
device or to protect mobile media, users should be able to rely on the encryption being
in place without having to explicitly activate it to protect data.
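A minimal sketch of what encryption by default can look like in code follows: a storage wrapper that encrypts on every write, with nothing for the user to activate. It uses the Fernet recipe from the widely used Python cryptography package; the wrapper class itself is an illustrative assumption:

```python
# Sketch of encryption by default: every write is encrypted transparently, so
# users rely on protection being in place without activating anything. Uses the
# Fernet recipe from the "cryptography" package; the wrapper is illustrative.
from cryptography.fernet import Fernet

class DefaultEncryptedStore:
    def __init__(self, key: bytes):
        self._fernet = Fernet(key)
        self._blobs = {}                         # holds ciphertext only

    def put(self, name: str, data: bytes) -> None:
        self._blobs[name] = self._fernet.encrypt(data)   # encryption is not optional

    def get(self, name: str) -> bytes:
        return self._fernet.decrypt(self._blobs[name])

key = Fernet.generate_key()                      # in practice, held in an OS keystore
store = DefaultEncryptedStore(key)
store.put("note", b"patient record 123")
assert store.get("note") == b"patient record 123"
# The default state of the data at rest is "unreadable" without the key:
assert store._blobs["note"] != b"patient record 123"
```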
All of the above considerations apply when encryption is used to secure the data stored on
mobile devices and media such as laptops, cellphones, portable hard drives and memory
sticks. They also apply to encryption used as an integral part of secure communications such
as virtual private networks, secure email systems, and secure Web access. However, there is
a final functional consideration when entire IT infrastructures are being designed and built:
h) Threat and Risk Assessment – IT infrastructures that use security technologies such as
encryption should be subjected to a Threat and Risk Assessment prior to live operations
(and preferably prior to implementation) to ensure that they work as expected.
Source: Fact Sheet 16: Health-Care Requirement for Strong Encryption, July 2010.
“This is amazing. Every time I see something like this, it makes me sad that the U.S.
doesn’t have anything like your office. The Commissioner has yet again shown bold
leadership in the privacy space. I can only hope that the major Web 2.0 companies listen
to her, and embrace the philosophy of Privacy by Design. Pat yourselves on the back for
doing a great job.”
Christopher Soghoian, formerly with Berkman Center for Internet & Society,
Harvard University
2. In a Cloud Computing environment, a consumer (individual
or enterprise) may choose to encrypt all personal or
otherwise sensitive data both while the data is stored on
a Cloud service provider’s servers (at rest) and while being
transmitted to end-users (in motion) – along with, of course,
appropriate protections while the data is in use. Encrypting
consumer data prior to outsourcing to the Cloud is at the
heart of the architecture proposed in a Cloud Computing
paper co-written with NEC, along with systems to ensure
appropriate access to data is not reduced.
3. Analyzing Encrypted Data For Insights: In a paper written for an IEEE conference on
the Smart Grid, we proposed using what is known as a “Fully Homomorphic Encryption
Scheme” that allows users to hand off the processing of data to a vendor without giving
away access to that data. The technique adds an important layer of safety and privacy
to the online world in settings ranging from banking and health care to networks and
Cloud computing. This significant research was recognized by the IPC through the
PET Symposium award for innovative research in privacy and security to Craig Gentry,
IBM. The work of Dr. Khaled El Emam also involves a protocol that uses an additive
homomorphic encryption system, allowing mathematical operations to be performed
on encrypted values. This is in conjunction with his continuing work on de-identification
and health research data.
Sources: Award winner’s breakthrough efforts reveal how technology can lock-in privacy:
Commissioner Ann Cavoukian, July 2010; Ann Cavoukian & Klaus Kursawe – Implementing
Privacy by Design: The Smart Meter Case. Paper presented at “the IEEE International Conference
on Smart Grid Engineering (SGE’12),” Oshawa, Ontario (to be published); Khaled El Emam
et al. (2012) – A secure distributed logistic regression protocol for the detection of rare adverse
drug events, Journal of the American Medical Informatics Association.
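To illustrate the additive homomorphic property mentioned above – performing mathematical operations directly on encrypted values – here is a toy sketch of the Paillier cryptosystem, one well-known additively homomorphic scheme. The cited protocols are considerably more elaborate, and real deployments use far larger keys:

```python
# Toy sketch of additively homomorphic encryption (the Paillier scheme):
# multiplying two ciphertexts yields an encryption of the SUM of the
# plaintexts. Toy key sizes for illustration only; real keys are >= 2048 bits.
import secrets
from math import gcd

p, q = 1789, 2029                 # toy primes
n = p * q
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)             # lcm(p-1, q-1)
n2 = n * n
g = n + 1                         # standard simplified generator
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)              # inverse of L(g^lam mod n^2)

def encrypt(m: int) -> int:
    while True:                   # pick a random r coprime to n
        r = secrets.randbelow(n - 1) + 1
        if gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return (pow(c, lam, n2) - 1) // n * mu % n

# Homomorphic property: E(41) * E(17) mod n^2 decrypts to 41 + 17.
c = (encrypt(41) * encrypt(17)) % n2
assert decrypt(c) == 58
```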
centralized security policy enforcement ‘suite’, which may also include port and plug-in device control with auto-encryption options) or as a stand-alone ‘endpoint solution’ applied on a case-by-case basis.
Source: Co-authored with Jeff Curtis (Sunnybrook Health Sciences), Nandini Jolly (CryptoMill
Technologies) – Encryption by Default and Circles of Trust, December 2012.
• In determining the method of destruction, organizations should consider the medium
of the record, whether the records require a stronger method of destruction based on
their sensitivity, and whether the media will be reused internally or moved out of the
organization.
• Neither recycling records nor simply placing them in the trash is an acceptable method of destruction – avoid both.
• Before engaging a service provider to securely destroy records, organizations should develop criteria for choosing a provider, and should confirm the provider’s methods of destruction and how records will be securely transported to the selected provider.
• Organizations should sign a contract or formal agreement with all external service
providers hired to destroy records.
• Once materials are securely destroyed, they should be restricted from public access
until disposed of permanently.
• Organizations should audit their secure destruction programs to ensure employee and
service provider compliance.
Source: Robert Johnson, Co-author (National Association for Information Destruction (NAID))
– Get rid of it Securely to keep it Private: Best Practices for the Secure Destruction of Personal
Health Information, October 2009.
Principle 6

Actions:
1. Make the identity and contact information of the individual(s) responsible for privacy and security available to the public and well known within the organization.
2. Implement a policy that requires all “public-facing” documents to be written in “plain language” that is easily understood by the individuals whose information is the subject of the policies and procedures.

Responsibility: Leadership/Senior Management; Software Engineers
Visibility and transparency are essential to establishing accountability and trust – not only
for individuals but also for business partners, regulators and other involved stakeholders. It
is increasingly in the interests of everyone – from application developers to systems architects,
as well as organizational leadership – to be able to demonstrate effective privacy due diligence,
especially in the event of a breach, a complaint, or an external audit. The long-term audit
requirements imposed by FTC settlements are evidence of heightened expectations in this
realm. Here in Ontario, personal health data registries must similarly sign affidavits every
three years to confirm that they are adhering to minimum policies and practices.
You can outsource services, but you cannot outsource accountability. There are also growing demands for audit rights in contracts, and for concrete evidence of adherence to standards, contracts, and laws. Privacy metrics are essential. Standardized processes and third-party privacy seals or marks of review, approval and certification may also be useful. In 2007, EuroPriSe introduced a European Privacy Seal for IT products and IT-based services that have been shown, through a two-step independent certification procedure, to comply with European data protection laws. More recently, in October 2012, the Future of Privacy Forum and TRUSTe launched a Smart Grid Privacy Seal Program.
The implementation of Privacy by Design also opens up a stream of dialogue, not only within
organizations, but also between organizations and the customers they serve. The importance
of this dialogue cannot be overstated – effective communication with end-users is the essential
link between implementing strong privacy practices and fostering the consumer confidence
and trust that leads to sustainable competitive advantage. Further, it enables privacy leaders
to earn the recognition they deserve.
For this reason, it is essential that important privacy attributes about a system or process
be brought to users’ attention in relevant, timely and actionable ways. It should be relatively
simple for users to find out critical privacy details about a technology or information system
and how it operates. Clear documentation that is easily understood and that provides a
reasonable and sufficient basis for informed decision-making must be provided.
There is widespread consensus that the prevailing Notice and Choice approach to user privacy is deeply flawed. Users rarely read the lengthy, legalistic, “take it or leave it” policies and terms they are presented with. Organizations that rely on such policies are mistaken if they believe that consumers have seen, understood or knowledgeably accepted their privacy practices.
Whether installing a new application or interacting with a website or social networking platform, users need to be well informed about important system privacy attributes, including, at a minimum, which privacy policies apply and who is responsible for them.
In applying this principle, it is useful to bear in mind that the way we interact with devices
is constantly changing. Considerable research and experimentation is being undertaken into
Human-Computer Interface (HCI) design to improve user awareness and understanding. Other
potentially relevant approaches that are being explored include standardized short notices
and machine-readable privacy policies.
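For instance, a standardized short notice could be published in machine-readable form alongside the human-readable policy. The sketch below uses a hypothetical JSON structure of our own devising (no standards body has blessed these field names), rendered with Python’s standard json module:

    import json

    # Hypothetical machine-readable "short notice" accompanying a full policy.
    short_notice = {
        "data_collected": ["email address", "postal code"],
        "purpose": "order fulfilment and delivery",
        "retention_days": 90,
        "shared_with_third_parties": False,
        "privacy_contact": "[email protected]",   # responsible individual
        "full_policy_url": "https://example.org/privacy",
    }

    print(json.dumps(short_notice, indent=2))      # publishable, parseable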
The following provide examples of how this principle is implemented. Also, refer to the
accountability tools and examples provided under Principle 1: “Proactive not Reactive;
Preventative not Remedial.”
CCTV/Video Surveillance
• The public should be notified, using clearly written signs prominently displayed at the perimeter of the video surveillance areas, of video surveillance equipment locations, so that the public has reasonable and adequate warning that surveillance is, or may be, in operation before entering any area under video surveillance. Signs at the perimeter of the surveillance areas should identify someone who can answer questions about the video surveillance system, and can include an address, telephone number, or website for contact purposes.
• Organizations should ensure that the use and security of video surveillance equipment
is subject to regular audits. The audit should also address the organization’s compliance
with the operational policies and procedures. An external body may be retained in
order to perform the audit. Any deficiencies or concerns identified by the audit must
be addressed immediately.
o In the 2008 TTC Privacy Investigation Report (MC07-68), one of the recommendations pertains to audits: “The TTC must ensure that its video surveillance program is subjected to an effective and thorough audit conducted by an independent third party, using the GAPP Privacy Framework.”
Source: Guidelines for the Use of Video Surveillance Cameras in Public Places, September 2007.
Radio Frequency Identification (RFID)
Accountability: An organization is responsible for personal information under its control and
should designate a person who will be accountable for the organization’s compliance with the
following principles, and the necessary training of all employees. Organizations should use
contractual and other means to provide a comparable level of protection if the information
is disclosed to third parties. Organizations that typically have the most direct contact and
primary relationship with the individual should bear the strongest responsibility for ensuring
privacy and security, regardless of where the RFID-tagged items originate or end up in the
product lifecycle.
Openness:
• Organizations should publish, in compliance with applicable
laws, information on their policies respecting the collection,
retention, and uses of RFID-linked consumer information.
• Signs at the perimeter should identify someone who can answer questions about the
RFID system, and include their contact information.
• Consumers should always know when, where, and why an RFID tag is being read. Visual or
audio indicators should be built into the operation of the RFID system for these purposes.
Challenging Compliance:
• Organizations should inform consumers of their rights and the procedures available to challenge the organization’s compliance with these privacy principles.
• Organizations may wish to ensure that the use and security of any RFID technology or
system is subject to regular audits. For example, the audit could address the company’s
compliance with the operational policies and procedures.
Source: Privacy Guidelines for RFID Information Systems (RFID Privacy Guidelines), June 2006.
For application of these guidelines to the health sector, see RFID
and Privacy: Guidance for Health-Care Providers, January 2008.
The essential purpose of this publication is to assist the health-care
sector in understanding the current and potential applications of
RFID technology, its potential benefits, privacy implications, and
the steps that can be taken to mitigate potential threats to privacy.
Source: Short Notices to the Public under the Personal Health Information Protection Act: Your
Health Information and Your Privacy in Our Office; Your Health Information and Your Privacy
in Our Hospital; Your Health Information and Your Privacy in Our Facility, June 2005.
Principle 7

Actions:
1. Offer strong privacy defaults.
2. Provide appropriate notice.
3. Consider user-friendly options:
   a. Make user preferences persistent and effective.
   b. Provide users with access to data about themselves.

Responsibility: Leadership/Senior Management; Software Engineers & Developers; Application & Program Owners
At its core, respecting the user means that when designing or deploying an information
system, an individual’s privacy and user interests are accommodated, right from the
outset. User-centricity is designing for the user, anticipating his or her privacy perceptions,
needs, requirements, and default settings.
Operational aspects of this principle include measures to assure transparency, attain informed
user consent, provide rights of access and correction, and make effective redress mechanisms
available. Users expect privacy preferences and settings to be clear, to function across platforms,
and to persist over time. Preferences and settings should also cascade to third parties (e.g.
opt-out). Robust consent mechanisms have significant uses in the Cloud, social and mobile
computing applications, online tracking and advertising services, online contracts, electronic
health records, and personal data vaults. These issues are being examined by industry and
public policy-makers. Organizational policies and processes should demonstrate the same
degree of consideration for users at all touch points and interactions.
There must be a way for users to gain insight into the operations and functioning of any
technology or system that they are interacting with, preferably in real time. User controls
should be timely and actionable, and supported by appropriate feedback. Defaults should be
set appropriately, by which we mean in the most privacy-protective manner possible.
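A minimal sketch of what “most privacy-protective by default” can mean in code (Python; the setting names are hypothetical): every field defaults to the protective choice, so a user who never opens a settings screen is fully protected, and any weakening of protection must be an explicit, recorded act.

    from dataclasses import dataclass, field

    @dataclass
    class PrivacySettings:
        # Hypothetical settings; each default is the most protective option.
        profile_public: bool = False           # sharing is opt-in, not opt-out
        location_sharing: bool = False
        third_party_tracking: bool = False
        data_retention_days: int = 30          # shortest supported period
        audit_log: list = field(default_factory=list)

        def weaken(self, name: str, value, reason: str) -> None:
            """Any reduction in protection is explicit and auditable."""
            setattr(self, name, value)
            self.audit_log.append((name, value, reason))

    settings = PrivacySettings()               # protective with no user action
    settings.weaken("location_sharing", True, "user enabled store locator")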
The concept of “user-centricity” has evolved into two sometimes contradictory meanings in
networked or online environments. As it pertains to privacy, it contemplates a right of control
by an individual over his or her personal information when online, usually with the help of
technology. For most system designers, however, it describes an information and communications
system built with users in mind, which anticipates and addresses their privacy interests, risks
and needs. One view is libertarian (informational self-determination); the other is somewhat
paternalistic. Both views are valid, but must be qualified in the Information Age. Privacy by
Design embraces both understandings of user-centricity. Information technologies, processes
and infrastructures must be designed not only for individual users, but also structured by
them. Users are rarely, if ever, involved in every design decision or transaction involving
their personal information, but they are nonetheless in an unprecedented position today to
exercise a measure of meaningful control over those designs and transactions, as well as the
disposition and use of their personal information by others. As with the other principles of
Fair Information Practices and Privacy by Design, Respect for User Privacy is not a stand-
alone principle. It is intertwined with the remaining principles (e.g., on transparency, security
safeguards, default settings, embedding privacy, and achieving positive-sum results).
Respect for user privacy is closely bound up with the practice of user interface design. The user interface is that dimension of the system by which users interact with a machine. It includes hardware (physical) and software (logical) components.
Individual access rights are enshrined in most public sector privacy laws and practices. Today, an access revolution is occurring in Cloud, mobile and social computing contexts. Online account management is common, and people expect direct access to the personal data held about them, especially when there is a privacy breach.
In this spirit, my office applauded the launch of Google Dashboard, which gave users unprecedented visibility into, and control over, the collection, use and disclosure of their personal information across Google’s services. Indeed, generally speaking, we have been supportive of any user-controlled devices, agents, and platforms that allow maximum user control over personal data and its processing by others, such as personal health records, data vaults, and ultimately SmartData.
The following are examples of mechanisms that take a user-centric approach to privacy:
Source: Privacy and Government 2.0: The Implications of an Open World, May 2009.
2. Building Privacy into User Interface Design: Through
this joint paper with Yahoo!, we contributed to a deepening
evidence base that the privacy and policy community
can draw upon in future work that exemplifies user-
centricity. Good product and business process designs
are needed to empower users to achieve strong privacy.
Effective user interfaces are critical to good design and
operation. General user interface (UI) or user experience
(UX) design (“UID/UXD”) theory and evaluation criteria
continue to evolve with 21st century technologies. The
application of UI/UX design principles to the online
environment and user privacy experience represents a
subset of a much larger field of inquiry. Context matters
greatly in how design principles and criteria are applied.
Legal requirements, project domain and scope, objectives to be achieved, and the
nature, volume and sensitivity of the personal data processing involved will all vary
in influence, along with the extent of user participation. Context must inform sound
decision-making, and must therefore be the cornerstone of sound design. Adaptation
to a privacy context requires taking a principled approach, executing judgement, and
considering some form of metrics.
Source: Justin B. Weiss, Co-author (Yahoo!) – Privacy by Design and User Interfaces:
Emerging Design Criteria - Keep it User-Centric, June 2012.
Source: Co-authored with Shane Green, Josh Galper (Personal), Drummond Reed (Respect Network), Liz Brandt and Alan Mitchell (Ctrl-Shift) – Privacy by Design and the Emerging Personal Data Ecosystem, October 2012.
5. SmartData: Privacy Meets Evolutionary Robotics – Protecting Freedom Using Virtual Tools. Technology must form an integral component in the defence of our personal privacy. Policies and regulations will serve, at best, as lagging remedies in the fast-paced world of cyberspace. In a world where personal information can increasingly be transmitted and used in multiple locations simultaneously, protecting privacy may only truly be accomplished if the information itself becomes “intelligent” and capable of making appropriate decisions, relating to its release, on behalf of the data subject. In other words, the data must become “smart” – hence, we need SmartData. This research at the Identity, Privacy and Security Institute at the University of Toronto looks into the growing need, the challenges, and ultimately, the benefits of developing virtual, intelligent agents to protect our privacy online.

Source: George Tomko, Ph.D., Donald Borrett, Ph.D., Hon C. Kwan, Ph.D., & Greg Steffan, Ph.D., SmartData: Make the data “think” for itself. Data Protection for the 21st Century, February 2010.
Conclusions
This paper has provided an overview of some of the work that my office has been engaged in over the years, and of the experiences of our innovative partners in these efforts to give meaningful operational effect to the principles of Privacy by Design. By reflecting on the work of many international companies and organizations, I hope to encourage readers to create their own paths.
Our work is far from complete – in fact, it has just begun. There is a long road ahead in the
journey of translating PbD’s 7 Foundational Principles into concrete, prescriptive requirements,
specifications, standards, best practices, and operational performance criteria. It is a journey
that must, by necessity, involve not only executives, but especially software engineers and
designers, risk managers, marketing and customer service professionals, legal departments,
project managers, privacy officers, and many others. It must also encompass business
requirements, engineering specifications, development methodologies, and security controls,
according to each domain or project scope.
A number of initiatives are already underway that represent concrete steps toward operationalizing PbD and making its implementation part of the default rules for the next generation of privacy advocates, who will be tasked with responding to the new challenges we will face. One exciting development is a new Technical Committee of the international standards body OASIS (the Organization for the Advancement of Structured Information Standards) – PbD-SE (Software Engineers) – which I am co-chairing with Dr. Dawn Jutla, a professor of engineering at St. Mary’s University, Nova Scotia, to develop and promote standards for PbD in software engineering. Dr. Jutla is the winner of the prestigious U.S. World Technology Award (IT Software – Individual, 2009), recognized for innovative work of long-term significance on the evolving technological landscape, as well as for the transcendent imperative of privacy protection. At
Carnegie Mellon University, Professors Lorrie Faith Cranor and Norman Sadeh have developed
a new graduate program combining engineering and privacy – a Master’s program in “Privacy
Engineering.” A major element of the program is a PbD “learn-by-doing” component.
As I mentioned at the outset of this paper, the exercise of operationalizing Privacy by Design
– taking it from principles to actions – is one that each organization will undertake in its own
way. It is my hope that as they do so, they will make their own stories – their challenges,
victories, and lessons learned – broadly available so that the privacy community may continue
to build much-needed expertise, and grow best practices, for the benefit of all. I have always
said that Privacy by Design is not a theoretical construct or academic formulation – it has to
have legs, on the ground, now, in order to be effective. Together, we can make the concept of
privacy a reality, by design – now, and well into the future.
Appendices
• Privacy and Video Surveillance in Mass Transit Systems: A Special Investigation Report,
Dr. Ann Cavoukian, March 2008. http://www.ipc.on.ca/images/Findings/mc07-
68-ttc_592396093750.pdf
• Biometric Encryption Chapter from the Encyclopedia of Biometrics, Dr. Ann Cavoukian and
Alex Stoianov, Ph.D., December 2009. http://www.ipc.on.ca/images/Resources/
bio-encrypt-chp.pdf
• Fingerprint Biometric Systems: Ask the Right Questions Before You Deploy, Dr. Ann
Cavoukian, July 2008. http://www.ipc.on.ca/images/Resources/fingerprint-
biosys.pdf
• Fingerprint Biometrics: Address Privacy Before Deployment, Dr. Ann Cavoukian, November
2008. http://www.ipc.on.ca/images/Resources/fingerprint-biosys-priv.pdf
Smart Meters and the Smart Grid
• SmartPrivacy for the Smart Grid: Embedding Privacy into the Design of Electricity
Conservation, Office of the Information & Privacy Commissioner and the Future of
Privacy Forum, November 2009. http://www.ipc.on.ca/images/Resources/pbd-
smartpriv-smartgrid.pdf
• Privacy by Design: Achieving the Gold Standard in Data Protection for the Smart Grid,
Office of the Information & Privacy Commissioner, Hydro One Networks and Toronto
Hydro Corp., June 2010. http://www.ipc.on.ca/images/Resources/achieve-
goldstnd.pdf
• Frequently Asked Questions – Smart Grid Privacy – From Smart Meters to the Future,
Dr. Ann Cavoukian, October 2010. http://www.ipc.on.ca/images/Resources/
smartgrid-faq.pdf
• Operationalizing Privacy by Design: The Ontario Smart Grid Case Study, Office of the
Information & Privacy Commissioner, Hydro One, GE, IBM and Telvent, February 2011.
http://www.ipc.on.ca/images/Resources/pbd-ont-smartgrid-casestudy.pdf
• Applying Privacy by Design Best Practices to SDG&E’s Smart Pricing Program, Office of
the Information & Privacy Commissioner and San Diego Gas & Electric, March 2012.
http://www.ipc.on.ca/images/Resources/pbd-sdge.pdf
• Smart Meters in Europe: Privacy by Design at its Best, Dr. Ann Cavoukian, April 2012.
http://www.ipc.on.ca/images/Resources/pbd-smartmeters-europe.pdf
• Building Privacy into Ontario’s Smart Meter Data Management System: A Control
Framework, Office of the Information & Privacy Commissioner and the Independent
Electricity System Operator, May 2012. http://www.ipc.on.ca/images/Resources/
pbd-ieso.pdf
• Shaping Privacy on the Smart Grid – You can Make a Difference: A Roadmap for
Data Protection Commissioners and Privacy Regulators, Dr. Ann Cavoukian, October
2010. http://www.privacybydesign.ca/content/uploads/2010/10/2010-10-
roadmap_brochure.pdf
• Smart Grid Privacy 101: A Primer for Regulators, Dr. Ann Cavoukian, October 2010. http://
www.privacybydesign.ca/content/uploads/2010/10/smart-grid-primer.pdf
• Embedding Privacy into Smart Grid Initiatives, Dr. Ann Cavoukian, October 2010. http://
www.privacybydesign.ca/content/uploads/2010/10/smartgrid-tipsheet.pdf
• Wi-Fi Positioning Systems: Beware of Unintended Consequences, Dr. Ann Cavoukian
and Kim Cameron, June 2011. http://www.ipc.on.ca/images/Resources/wi-fi.pdf
• Safeguarding Personal Health Information When Using Mobile Devices for Research
Purposes, Office of the Information & Privacy Commissioner and the Children’s Hospital
of Eastern Ontario, September 2011. http://www.ipc.on.ca/images/Resources/
cheo-mobile_device_research.pdf
• Practical Tips for Implementing RFID Privacy Guidelines, Office of the Information &
Privacy Commissioner, June 2006. http://www.ipc.on.ca/images/Resources/
up-rfidtips.pdf
• RFID and Privacy – Guidance for Health-Care Providers, Office of the Information &
Privacy Commissioner and Hewlett-Packard Canada, January 2008. http://www.ipc.
on.ca/images/Resources/rfid-HealthCare.pdf
• Adding an On/Off Device to Activate the RFID in Enhanced Driver’s Licenses, Dr. Ann
Cavoukian, March 2009. http://www.ipc.on.ca/images/Resources/edl.pdf
• Remote Home Health Care Technologies: How to Ensure Privacy? Build It In:
Privacy by Design, Office of the Information & Privacy Commissioner, Intel and GE
Healthcare, November 2009. http://www.ipc.on.ca/images/Resources/pbd-
remotehomehealthcarew_Intel_GE.pdf
• Fact Sheet 16 – Health-Care Requirement for Strong Encryption, Dr. Ann Cavoukian, July 2010. http://www.ipc.on.ca/images/Resources/fact-16-e.pdf
• Sensors and In-Home Collection of Health Data: A Privacy by Design Approach, Office
of the Information & Privacy Commissioner and the Intelligent Assistive Technology
and Systems Lab, August 2010. http://www.ipc.on.ca/images/Resources/pbd-
sensor-in-home.pdf
• Transformative Technologies Deliver Both Security and Privacy: Think Positive-Sum not
Zero-Sum, Dr. Ann Cavoukian, July 2008. http://www.ipc.on.ca/images/Resources/
trans-tech.pdf
• Moving Forward from PETs to PETs Plus: The Time for Change is Now, Dr. Ann Cavoukian,
January 2009. http://www.ipc.on.ca/images/Resources/petsplus_3.pdf
• Privacy by Design: The 7 Foundational Principles, Dr. Ann Cavoukian, August 2009.
http://www.ipc.on.ca/english/Resources/Discussion-Papers/Discussion-
Papers-Summary/?id=883
• Privacy by ReDesign: Building a Better Legacy, Dr. Ann Cavoukian and Professor Marilyn
Prosch, May 2011. http://www.ipc.on.ca/images/Resources/AVAwhite6.pdf
• Privacy by Design Curriculum v2.0, Dr. Ann Cavoukian, November 2011. http://
privacybydesign.ca/publications/
Privacy by Design Papers Organized by Principle
1. Proactive not Reactive; Preventative not Remedial
• Cavoukian, Ann, Building Privacy into Ontario’s Smart Meter Data Management System: A Control Framework (Office of the Information and Privacy Commissioner, Ontario, Canada, and the Independent Electricity System Operator, May 2012), www.ipc.on.ca/images/Resources/pbd-ieso.pdf
• Cavoukian, Ann, Prosch, Marilyn. The Roadmap for Privacy by Design in Mobile
Communications: A Practical Tool for Developers, Service Providers, and Users (Office of
the Information and Privacy Commissioner, Ontario, Canada, December 2010), http://
www.ipc.on.ca/images/Resources/pbd-asu-mobile.pdf
• Cavoukian, Ann, Mobile Near Field Communications (NFC) “Tap ‘n Go” - Keep it Secure
and Private (Office of the Information and Privacy Commissioner, Ontario, Canada,
November 2011) http://www.ipc.on.ca/images/Resources/mobile-nfc.pdf
• Cavoukian, Ann, Privacy and Boards of Directors: What You Don’t Know Can Hurt You
(Office of the Information and Privacy Commissioner, Ontario, Canada, November 2003),
http://www.ipc.on.ca/images/Resources/director.pdf
• A Policy is Not Enough: It Must be Reflected in Concrete Practices (Office of the Information
and Privacy Commissioner, Ontario, Canada, September 2012), http://www.ipc.
on.ca/images/Resources/pbd-policy-not-enough.pdf
• Jesselon, Pat. & Fineberg, Anita. The Privacy by Design Privacy Impact Assessment
(The PbD-PIA), (Office of the Information and Privacy Commissioner, Ontario, Canada,
April 2011), http://privacybydesign.ca/content/uploads/2011/11/PbD-PIA-
Foundational-Framework.pdf
• Cavoukian, Ann, The New Federated Privacy Impact Assessment (F-PIA) Building Privacy
and Trust-enabled Federation (Office of the Information and Privacy Commissioner,
Ontario, Canada, January 2009), http://www.ipc.on.ca/English/Resources/
Discussion-Papers/Discussion-Papers-Summary/?id=836
• Wright, D., & de Hert, P. (2012). Privacy Impact Assessment. Law, Governance and Technology Series, Vol. 6.
• Cavoukian, Ann., Abrams, E. Martin, Taylor, Scott., Privacy by Design: Essential for
Organizational Accountability and Strong Business Practices (Office of the Information
and Privacy Commissioner, Ontario, Canada, November 2009), http://www.ipc.on.ca/
images/Resources/pbd-accountability_HP_CIPL.pdf
• Cavoukian, Ann, OLG, YMCA, Privacy Risk Management: Building privacy protection into a
Risk Management Framework to ensure that privacy risks are managed, by default (Office
of the Information and Privacy Commissioner, Ontario, Canada, April 2010), http://
www.privacybydesign.ca/publications/accountable-business-practices/
• Office of the Information and Privacy Commissioner of Ontario and IBM, Privacy by
Design: From Policy to Practice (Office of the Information and Privacy Commissioner,
Ontario, Canada, September 2011), http://www.ipc.on.ca/images/Resources/
pbd-policy-practice.pdf
• Cavoukian, Ann. Adding an On/Off Device to Activate the RFID in Enhanced Driver’s
Licences: Pioneering a Made-in-Ontario Transformative Technology that Delivers Both
Privacy and Security (Office of the Information and Privacy Commissioner, Ontario,
Canada, March 2009), http://www.ipc.on.ca/images/Resources/edl.pdf
• Office of the Information and Privacy Commissioner Ontario, Video: A Word about
RFIDs and Your Privacy in the Retail Sector. (Office of the Information and Privacy
Commissioner, Ontario, Canada, March, 2006), http://www.ipc.on.ca/english/
Resources/Educational-Material/Educational-Material-Summary/?id=663
• Cavoukian, Ann, White Paper: Anonymous Video Analytics (AVA) technology and
privacy (Office of the Information and Privacy Commissioner, Ontario, Canada, April
2011), www.ipc.on.ca/images/Resources/AVAwhite6.pdf
• Cavoukian, Ann, Transformative Technologies Deliver Both Security and Privacy: Think
Positive-Sum not Zero-Sum. (Office of the Information and Privacy Commissioner,
Ontario, Canada, July 2008), http://www.ipc.on.ca/images/Resources/trans-
tech-handout_098824173750.pdf
• Cavoukian, Ann., & Winn, Caroline., Applying Privacy by Design Best Practices to SDG&E’s
Smart Pricing Program (Office of the Information and Privacy Commissioner Ontario,
Canada, March 2012), http://www.ipc.on.ca/images/Resources/pbd-sdge.pdf
• Rannenberg, K. (2010) Privacy by Design in Mobile Applications and Location Based Services.
Paper presented at “Privacy by Design: The Gold Standard,” Toronto, Ontario http://www.
privacybydesign.ca/content/uploads/2010/03/PbD_in_Mobile_Applications_
and_Location_Based_Services.20100128.Rannenberg.20100127.2.pdf
• Cavoukian, Ann, Prosch, Marilyn. The Roadmap for Privacy by Design in Mobile
Communications: A Practical Tool for Developers, Service Providers, and Users (Office of
the Information and Privacy Commissioner, Ontario, Canada, December 2010), http://
www.ipc.on.ca/images/Resources/pbd-asu-mobile.pdf
• Cavoukian, Ann, Mobile Near Field Communications (NFC) “Tap ‘n Go” - Keep it Secure
and Private (Office of the Information and Privacy Commissioner, Ontario, Canada,
November 2011) http://www.ipc.on.ca/images/Resources/mobile-nfc.pdf
• Cavoukian, Ann., Jonas, Jeff. Privacy by Design in the Age of Big Data (Office of the
Information and Privacy Commissioner, Ontario, Canada, June 2012), www.ipc.on.ca/
english/Resources/Discussion-Papers/Discussion-Papers-Summary/?id=1195
• Cavoukian, Ann., & Cameron, Kim. Wi-Fi Positioning Systems: Beware of Unintended
Consequences (Office of the Information and Privacy Commissioner, Ontario, Canada,
June 2011) http://www.ipc.on.ca/images/Resources/wi-fi.pdf
• Cavoukian, Ann, Privacy by Design in Law, Policy and Practice: A White Paper for
Regulators, Decision-makers and Policy-makers (Office of the Information and Privacy
Commissioner, Ontario, Canada, Aug 2011) http://www.ipc.on.ca/images/
Resources/pbd-law-policy.pdf
• Federal Trade Commission, (2012) Protecting Consumer Privacy in an Era of Rapid
Change, ftc.gov/os/2012/03/120326privacyreport.pdf p. 53
• Cavoukian, Ann, Privacy and Video Surveillance in Mass Transit Systems: A Special
Investigation Report MC07-68, (Office of the Information and Privacy Commissioner,
Ontario, Canada, March 2008), http://www.ipc.on.ca/images/Findings/mc07-
68-ttc_592396093750.pdf
• Office of the Information & Privacy Commissioner of Ontario, Shaping Privacy on the Smart
Grid - You Can Make a Difference: A Roadmap for Data Protection Commissioners and Privacy
Regulators (Office of the Information and Privacy Commissioner, Ontario, Canada, October
2010), http://www.privacybydesign.ca/content/uploads/2010/10/2010-10-
roadmap_brochure.pdf?search=search
• Alvarez, C. Richard., Cavoukian, Ann, Embedding Privacy into the Design of EHRs to Enable
Multiple Functionalities - Win/Win (Office of the Information and Privacy Commissioner,
Ontario, Canada, March 2012). www.ipc.on.ca/images/Resources/2012-03-02-
PbD-EHR.pdf
• Cavoukian, Ann., Hoffman, David., & Killen, Scott. Remote Home Health Care Technologies:
How to Ensure Privacy? Build It In: Privacy by Design (Office of the Information and
Privacy Commissioner, Ontario, Canada, November 2009), http://www.ipc.on.ca/
images/Resources/pbd-remotehomehealthcarew_Intel_GE.pdf
• Cavoukian, Ann., Mihailidis, Alex., Boger, Jennifer., Sensors and In-Home Collection
of Health Data: A Privacy by Design Approach (Office of the Information and Privacy
Commissioner, Ontario, Canada, August 2010), http://www.ipc.on.ca/images/
Resources/pbd-sensor-in-home.pdf
• Cavoukian, Ann. Abandon Zero-Sum, Simplistic either/or Solutions - Positive-sum is
Paramount: Achieving Public Safety and Privacy Concept (Office of the Information and
Privacy Commissioner, Ontario, Canada, November 2012) http://www.ipc.on.ca/
images/Resources/pbd-ctc.pdf
• “Fact Sheet 16: Health-Care Requirement for Strong Encryption” (Office of the Information
and Privacy Commissioner, Ontario, Canada, July 2010), http://www.ipc.on.ca/
images/Resources/fact-16-e.pdf
• Curtis, Jeff, & Jolly, Nandini, Encryption by Default and Circles of Trust (Office of the Information and Privacy Commissioner, Ontario, Canada, December 2012)
• Cavoukian, Ann., & Johnson, Robert. Get rid of it Securely to keep it private: Best practices
for the secure destruction of personal health information (Office of the Information and
Privacy Commissioner, Ontario, Canada, October 2009), http://www.ipc.on.ca/
images/Resources/naid.pdf
• Cavoukian, Ann, If You Want To Protect Your Privacy, Secure Your Gmail, (Office of the
Information and Privacy Commissioner, Ontario, Canada, July 2009), http://bit.ly/COvz3
• Cavoukian, Ann., Zeng, Ke, Modelling Cloud Computing Architecture Without Compromising
Privacy: a Privacy by Design Approach (Office of the Information and Privacy Commissioner,
Ontario, Canada, May 2010), http://bit.ly/aVjFBC
• Cavoukian, Ann. Privacy Guidelines for RFID Information Systems (RFID Privacy
Guidelines), (Office of the Information and Privacy Commissioner, Ontario, Canada,
June 2006), http://www.ipc.on.ca/images/Resources/rfid-guides&tips.pdf
• Cavoukian, Ann., Garcia, Victor. RFID and Privacy – Guidance for Health-Care Providers,
(Office of the Information & Privacy Commissioner, Ontario, Canada, January 2008),
http://www.ipc.on.ca/images/Resources/rfid-HealthCare.pdf
• Short Notice to the Public under the Personal Health Information Protection Act: Your
Health Information and Your Privacy in Our Office (Office of the Information and
Privacy Commissioner, Ontario, Canada, June 2005) http://www.ipc.on.ca/index.
asp?navid=46&fid1=257&fid2=2
• Short Notice to the Public under the Personal Health Information Protection Act: Your
Health Information and Your Privacy in Our Hospital (Office of the Information and
Privacy Commissioner, Ontario, Canada, June 2005) http://www.ipc.on.ca/index.
asp?navid=46&fid1=259&fid2=2
• Short Notice to the Public under the Personal Health Information Protection Act: Your
Health Information and Your Privacy in Our Facility (Office of the Information and
Privacy Commissioner, Ontario, Canada, June 2005) http://www.ipc.on.ca/index.
asp?navid=46&fid1=261&fid2=2
• Cavoukian, Ann., Weiss, B. Justin, Privacy by Design and User Interfaces: Emerging
Design Criteria - Keep it User-Centric (Office of the Information and Privacy Commissioner,
Ontario, Canada, June 2012), http://www.ipc.on.ca/images/Resources/pbd-
user-interfaces_Yahoo.pdf
• Cavoukian, Ann. 7 Laws of Identity: The Case for Privacy-Embedded Laws of Identity in the
Digital Age (Office of the Information and Privacy Commissioner, Ontario, Canada, October
2006), http://www.ipc.on.ca/images/Resources/up-7laws_whitepaper.pdf
• Green, Shane., Galper, Josh., Reed, Drummond., Brandt, Liz., Mitchell, Alan., Cavoukian,
Ann. Privacy by Design and the Emerging Personal Data Ecosystem, (Office of the
Information and Privacy Commissioner, Ontario, Canada, October 2012) http://www.
ipc.on.ca/images/Resources/pbd-pde.pdf
• Tomko, G. J., Borrett, D. S., Kwan, H. C., & Steffan, G. (2010). SmartData: Make the
data “think” for itself. Identity in the Information Society, 3(2), 343-362
• Cavoukian, A. (2012). Privacy by Design: Origins, Meaning, and Prospects for Assuring
Privacy and Trust in the Information Era Privacy Protection Measures and Technologies
in Business Organizations: Aspects and Standards (pp. 170-208): IGI Global.
Ann Cavoukian, Ph.D.
Information and Privacy Commissioner,
Ontario, Canada
December 2012