
Privacy Management Reference Model and Methodology (PMRM) Version 1.0

Committee Specification 02

17 May 2016

Specification URIs

This version:
http://docs.oasis-open.org/pmrm/PMRM/v1.0/cs02/PMRM-v1.0-cs02.pdf (Authoritative)
http://docs.oasis-open.org/pmrm/PMRM/v1.0/cs02/PMRM-v1.0-cs02.html
http://docs.oasis-open.org/pmrm/PMRM/v1.0/cs02/PMRM-v1.0-cs02.doc

Previous version:
http://docs.oasis-open.org/pmrm/PMRM/v1.0/cs01/PMRM-v1.0-cs01.pdf (Authoritative)
http://docs.oasis-open.org/pmrm/PMRM/v1.0/cs01/PMRM-v1.0-cs01.html
http://docs.oasis-open.org/pmrm/PMRM/v1.0/cs01/PMRM-v1.0-cs01.doc

Latest version:
http://docs.oasis-open.org/pmrm/PMRM/v1.0/PMRM-v1.0.pdf (Authoritative)
http://docs.oasis-open.org/pmrm/PMRM/v1.0/PMRM-v1.0.html
http://docs.oasis-open.org/pmrm/PMRM/v1.0/PMRM-v1.0.doc

Technical Committee:
OASIS Privacy Management Reference Model (PMRM) TC

Chair:
John Sabo ([email protected]), Individual

Editors:
Michele Drgon ([email protected]), DataProbity
Gail Magnuson ([email protected]), Individual
John Sabo ([email protected]), Individual

Abstract:
The Privacy Management Reference Model and Methodology (PMRM, pronounced “pim-rim”) provides a model and a methodology to
 understand and analyze privacy policies and their privacy management requirements in defined Use Cases; and
 select the technical Services, Functions and Mechanisms that must be implemented to support requisite Privacy Controls.
It is particularly valuable for Use Cases in which Personal Information (PI) flows across regulatory, policy, jurisdictional, and system boundaries.

Status:
This document was last revised or approved by the OASIS Privacy Management Reference Model (PMRM) TC on the above date. The level of approval is also listed above. Check the “Latest version” location noted above for possible later revisions of this document. Any other numbered Versions and other technical work produced by the Technical Committee (TC) are listed at https://www.oasis-open.org/committees/tc_home.php?wg_abbrev=pmrm#technical.

PMRM-v1.0-cs02 17 May 2016
Standards Track Work Product. Copyright © OASIS Open 2016. All Rights Reserved.
TC members should send comments on this specification to the TC’s email list. Others should send comments to the TC’s public comment list, after subscribing to it by following the instructions at the “Send A Comment” button on the TC’s web page at https://www.oasis-open.org/committees/pmrm/.

For information on whether any patents have been disclosed that may be essential to implementing this specification, and any offers of patent licensing terms, please refer to the Intellectual Property Rights section of the TC’s web page (https://www.oasis-open.org/committees/pmrm/ipr.php).

Citation format:
When referencing this specification the following citation format should be used:

[PMRM-v1.0]
Privacy Management Reference Model and Methodology (PMRM) Version 1.0. Edited by Michele Drgon, Gail Magnuson, and John Sabo. 17 May 2016. OASIS Committee Specification 02. http://docs.oasis-open.org/pmrm/PMRM/v1.0/cs02/PMRM-v1.0-cs02.html. Latest version: http://docs.oasis-open.org/pmrm/PMRM/v1.0/PMRM-v1.0.html.

Notices

Copyright © OASIS Open 2016. All Rights Reserved.

All capitalized terms in the following text have the meanings assigned to them in the OASIS Intellectual Property Rights Policy (the "OASIS IPR Policy"). The full Policy may be found at the OASIS website.

This document and translations of it may be copied and furnished to others, and derivative works that comment on or otherwise explain it or assist in its implementation may be prepared, copied, published, and distributed, in whole or in part, without restriction of any kind, provided that the above copyright notice and this section are included on all such copies and derivative works. However, this document itself may not be modified in any way, including by removing the copyright notice or references to OASIS, except as needed for the purpose of developing any document or deliverable produced by an OASIS Technical Committee (in which case the rules applicable to copyrights, as set forth in the OASIS IPR Policy, must be followed) or as required to translate it into languages other than English.

The limited permissions granted above are perpetual and will not be revoked by OASIS or its successors or assigns.

This document and the information contained herein is provided on an "AS IS" basis and OASIS DISCLAIMS ALL WARRANTIES, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO ANY WARRANTY THAT THE USE OF THE INFORMATION HEREIN WILL NOT INFRINGE ANY OWNERSHIP RIGHTS OR ANY IMPLIED WARRANTIES OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE.

OASIS requests that any OASIS Party or any other party that believes it has patent claims that would necessarily be infringed by implementations of this OASIS Committee Specification or OASIS Standard, to notify OASIS TC Administrator and provide an indication of its willingness to grant patent licenses to such patent claims in a manner consistent with the IPR Mode of the OASIS Technical Committee that produced this specification.

OASIS invites any party to contact the OASIS TC Administrator if it is aware of a claim of ownership of any patent claims that would necessarily be infringed by implementations of this specification by a patent holder that is not willing to provide a license to such patent claims in a manner consistent with the IPR Mode of the OASIS Technical Committee that produced this specification. OASIS may include such claims on its website, but disclaims any obligation to do so.

OASIS takes no position regarding the validity or scope of any intellectual property or other rights that might be claimed to pertain to the implementation or use of the technology described in this document or the extent to which any license under such rights might or might not be available; neither does it represent that it has made any effort to identify any such rights. Information on OASIS' procedures with respect to rights in any document or deliverable produced by an OASIS Technical Committee can be found on the OASIS website. Copies of claims of rights made available for publication and any assurances of licenses to be made available, or the result of an attempt made to obtain a general license or permission for the use of such proprietary rights by implementers or users of this OASIS Committee Specification or OASIS Standard, can be obtained from the OASIS TC Administrator. OASIS makes no representation that any information or list of intellectual property rights will at any time be complete, or that any claims in such list are, in fact, Essential Claims.

The name "OASIS" is a trademark of OASIS, the owner and developer of this specification, and should be used only to refer to the organization and its official outputs. OASIS welcomes reference to, and implementation and use of, specifications, while reserving the right to enforce its marks against misleading uses. Please see https://www.oasis-open.org/policies-guidelines/trademark for above guidance.

5 PMRM-v1.0-cs02 17 May 2016


6 Standards Track Work Product Copyright © OASIS Open 2016. All Rights Reserved. Page 3 of 37
Table of Contents

1 Introduction.......................................................... 6
1.1 General Introduction to the PMRM................................ 6
1.2 Major Changes from PMRM V1.0 CS01............................... 7
1.3 Context......................................................... 7
1.4 Objectives and Benefits......................................... 8
1.5 Target Audiences................................................ 9
1.6 Specification Summary........................................... 9
1.7 Terminology.................................................... 11
1.8 Normative References........................................... 12
1.9 Non-Normative References....................................... 12
2 Develop Use Case Description and High-Level Privacy Analysis..... 13
2.1 Application and Business Process Descriptions.................. 13
Task #1: Use Case Description...................................... 13
Task #2: Use Case Inventory........................................ 14
2.2 Applicable Privacy Policies.................................... 14
Task #3: Privacy Policy Conformance Criteria....................... 14
2.3 Initial Privacy Impact (or other) Assessment(s) [optional]..... 15
Task #4: Assessment Preparation.................................... 15
3 Develop Detailed Privacy Analysis................................ 16
3.1 Identify Participants and Systems, Domains and Domain Owners, Roles and Responsibilities, Touch Points and Data Flows (Tasks #5-10)..... 16
Task #5: Identify Participants..................................... 16
Task #6: Identify Systems and Business Processes................... 16
Task #7: Identify Domains and Owners............................... 17
Task #8: Identify Roles and Responsibilities within a Domain....... 18
Task #9: Identify Touch Points..................................... 18
Task #10: Identify Data Flows...................................... 18
3.2 Identify PI in Use Case Domains and Systems.................... 19
Task #11: Identify Incoming PI..................................... 19
Task #12: Identify Internally Generated PI......................... 19
Task #13: Identify Outgoing PI..................................... 19
3.3 Specify Required Privacy Controls Associated with PI........... 19
Task #14: Specify Inherited Privacy Controls....................... 19
Task #15: Specify Internal Privacy Controls........................ 20
Task #16: Specify Exported Privacy Controls........................ 20
4 Identify Services and Functions Necessary to Support Privacy Controls..... 21
4.1 Services and Functions Needed to Implement the Privacy Controls..... 21
4.2 Service Details and Function Descriptions...................... 23
4.2.1 Core Policy Services......................................... 23
1. Agreement Service............................................... 23
2. Usage Service................................................... 23
4.2.2 Privacy Assurance Services................................... 23
3. Validation Service.............................................. 23
4. Certification Service........................................... 24
5. Enforcement Service............................................. 24
6. Security Service................................................ 24
4.2.3 Presentation and Lifecycle Services.......................... 25
7. Interaction Service............................................. 25
8. Access Service.................................................. 25
4.3 Identify Services satisfying the Privacy Controls.............. 25
Task #17: Identify the Services and Functions necessary to support operation of identified Privacy Controls..... 25
5 Define Technical and Procedural Mechanisms Supporting Selected Services and Functions..... 27
5.1 Identify Mechanisms Satisfying the Selected Services and Functions..... 27
Task #18: Identify the Mechanisms that Implement the Identified Services and Functions..... 27
6 Perform Operational Risk and/or Compliance Assessment............ 28
Task #19: Conduct Risk Assessment.................................. 28
7 Initiate Iterative Process....................................... 29
Task #20: Iterate the analysis and refine.......................... 29
8 Conformance...................................................... 30
8.1 Introduction................................................... 30
8.2 Conformance Statement.......................................... 30
9 Operational Definitions for Privacy Principles and Glossary...... 31
9.1 Operational Privacy Principles................................. 31
9.2 Glossary....................................................... 32
9.3 PMRM Acronyms.................................................. 36
Appendix A. Acknowledgments........................................ 37

9 PMRM-v1.0-cs02 17 May 2016


10 Standards Track Work Product Copyright © OASIS Open 2016. All Rights Reserved. Page 5 of 37
1 Introduction

1.1 General Introduction to the PMRM

The Privacy Management Reference Model and Methodology (PMRM) addresses the reality of today’s networked, interoperable systems, applications and devices, coupled with the complexity of managing Personal Information (PI)¹ across legal, regulatory and policy environments in these interconnected Domains. It can be of great value both to business and program managers who need to understand the implications of Privacy Policies for specific business systems and to assess privacy management risks, as well as to developers and engineers who are tasked with building privacy into Systems and Business Processes.

Additionally, the PMRM is a valuable tool for achieving Privacy by Design, particularly for those seeking to improve privacy management, compliance and accountability in complex, integrated information systems and solutions, such as health IT, financial services, federated identity, social networks, smart grid, mobile apps, cloud computing, Big Data, the Internet of Things (IoT), etc. Achieving Privacy by Design is challenging enough in relatively simple systems, but it can present insurmountable challenges in the complex systems we see today, where the use of PI across the entire ecosystem is governed by a web of laws, regulations, business contracts, operational policies and technologies.

The PMRM is neither a static model nor a purely prescriptive set of rules (although it includes characteristics of both). It is built around the development of a clearly bounded Use Case, which forms the basis for a Privacy Management Analysis (PMA). Implementers have flexibility in determining the level and granularity of analysis required for their particular Use Case.

A Use Case can be scoped narrowly or broadly. Although its granular applicability is perhaps most useful to practitioners, the PMRM can also be employed at a broader level, encompassing an entire enterprise, product line or common set of functions within a company or government agency. At such a comprehensive level, the privacy office could establish broad Privacy Controls, implemented by Services and their underlying Functionality in manual and technical Mechanisms; these, in turn, would produce a high-level PMA and could also inform a high-level Privacy Architecture. Both the PMA and a Privacy Architecture could then be used to incorporate these reusable Services, Functions and Mechanisms in future initiatives, enabling improved risk assessment, compliance and accountability.

To ensure Privacy by Design at the granular level, a Use Case will more likely be scoped for a specific design initiative. However, the benefit of having used the PMRM at the broadest level first is that it informs more granular initiatives with guidance from an enterprise perspective, potentially reducing the amount of work for the privacy office and engineers.

Even if the development of an overarching PMA is not appropriate for an organization, the PMRM will be useful in fostering interoperable policies and policy management standards and solutions. In this way, the PMRM further enables Privacy by Design because of its analytic structure and primarily operational focus. A PMRM-generated PMA, because of its clear structure and defined components, can be a valuable tool to inform the development of similar applications or systems that use PI.

As noted in Section 8, the PMRM as a “model” is abstract. As a Methodology, however, it is through the process of developing a detailed Use Case and a PMA that important levels of detail emerge, enabling a complete picture of how privacy risks and privacy requirements are being managed. As a Methodology the PMRM, richly detailed and having multiple, iterative task levels, is intentionally open-ended and can help users build PMAs at whatever level of complexity they require.

¹ Note: We understand the important distinction between ‘Personal Information’ (PI) and ‘Personally-Identifiable Information’ (PII) and that in specific contexts a clear distinction must be made explicitly between the two, which should be reflected as necessary by users of the PMRM. However, for the purposes of this document, the term ‘PI’ will be used as an umbrella term to simplify the specification. Section 9.2 Glossary addresses the distinctions between PI and PII.
Note: It is strongly recommended that Section 9, Operational Definitions for Privacy Principles and Glossary, be read before proceeding. The Operational Privacy Principles and the Glossary are key to a solid understanding of Sections 2 through 8.

1.2 Major Changes from PMRM V1.0 CS01

This version of the PMRM incorporates a number of changes that are intended to clarify the PMRM methodology, resolve inconsistencies in the text, address the increased focus on accountability by privacy regulators, improve definitions of terms, expand the Glossary, improve the graphical figures used to illustrate the PMRM, and add references to the OASIS Privacy by Design Documentation for Software Engineers committee specification. Although the PMRM specification has not fundamentally changed, the PMRM technical committee believes the changes in this version will increase the clarity of the PMRM and improve its usability and adoption by stakeholders who are concerned about operational privacy, compliance and accountability.

1.3 Context

Predictable and trusted privacy management must function within a complex, interconnected set of networks, Business Processes, Systems, applications, devices, data, and associated governing policies. Such a privacy management capability is needed in traditional computing, in Business Process engineering, in cloud computing delivery environments, and in emerging IoT environments.

An effective privacy management capability must be able to instantiate the relationship between PI and associated privacy policies. The PMRM supports this by producing a PMA, mapping Policy to Privacy Controls to Services and Functions, which in turn are implemented via Mechanisms, both technical and procedural. The PMA becomes the input to the next iteration of the Use Case and informs other initiatives, so that the privacy office and engineers can apply the output of the PMRM analysis to other applications to shorten their design cycles.
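The Policy-to-Privacy-Controls-to-Services-to-Mechanisms mapping described above can be pictured as a simple traceability structure. The following is an illustrative sketch only; the class names, fields and example values are our own assumptions, not terms defined normatively by the PMRM:

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Mechanism:
    name: str
    kind: str  # "technical" or "procedural", per the distinction in the text

@dataclass
class Function:
    name: str
    mechanisms: list[Mechanism] = field(default_factory=list)

@dataclass
class Service:
    name: str  # e.g. one of the eight PMRM Services, such as "Security"
    functions: list[Function] = field(default_factory=list)

@dataclass
class PrivacyControl:
    name: str
    control_class: str  # "Inherited", "Internal" or "Exported"
    services: list[Service] = field(default_factory=list)

@dataclass
class Policy:
    source: str  # a law, regulation, contract clause or operational policy
    controls: list[PrivacyControl] = field(default_factory=list)

def audit_trail(policy: Policy) -> list[tuple[str, str, str, str, str]]:
    """Flatten the chain so every Mechanism is traceable back to its Policy."""
    return [
        (policy.source, c.name, s.name, f.name, m.name)
        for c in policy.controls
        for s in c.services
        for f in s.functions
        for m in f.mechanisms
    ]
```

A structure like this makes the audit trail from Policy down to implementing Mechanism an explicit, queryable artifact rather than something scattered across documents.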
The main types of Policy covered in this specification are expressed as classes of Privacy Controls: Inherited, Internal or Exported. The Privacy Controls must be expressed with sufficient granularity to enable the design of Services consisting of Functions, instantiated through implementing Mechanisms throughout the lifecycle of the PI. Services must accommodate a changing mix of PI and policies, whether inherited, communicated to and from external Domains, or imposed internally. The PMRM methodology makes possible a detailed, structured analysis of the business or application environment, creating a custom PMA for the particular Use Case.

A clear strength of the PMRM is its recognition that today’s systems and applications span jurisdictions with inconsistent and conflicting laws, regulations, business practices, and consumer preferences. This creates huge challenges for privacy management and compliance, and these challenges are unlikely to diminish in any significant way, especially in the face of rapid technological change and innovation and differing social and national values, norms and policy interests.

It is also important to note that in this environment agreements may not be enforceable in certain jurisdictions, and a dispute over jurisdiction may have significant bearing on what rights and duties the participants have regarding use and protection of PI. Even the definition of PI will vary. The PMRM may be useful in addressing these issues. Because data can in many cases easily migrate across jurisdictional boundaries, rights cannot necessarily be protected without explicit specification of what boundaries apply. Proper use of the PMRM will, however, expose the realities of such environments together with any rules, policies and solutions in place to address them.

256 1.4 Objectives and Benefits


257 The PMRM’s primary objectives are to enable the analysis of complex Use Cases, to understand and
258 design appropriate operational privacy management Services and their underlying Functionality, to
259 implement this Functionality in Mechanisms and to achieve compliance across Domains, systems, and
260 ownership and policy boundaries. A PMRM-derived PMA may also be useful as a tool to inform policy

19 PMRM-v1.0-cs02 17 May 2016


20 Standards Track Work Product Copyright © OASIS Open 2016. All Rights Reserved. Page 7 of 37
261 development applicable to multiple Domains, resulting in Privacy Controls, Services and Functions,
262 implementing Mechanisms and – potentially - a Privacy Architecture.
263 Note: Unless otherwise indicated specifically or by context, the use of the term ‘policy’ or ‘policies’ in this
264 document may be understood as referencing laws, regulations, contractual terms and conditions, or
265 operational policies associated with the collection, use, transmission, sharing, cross-border transfers,
266 storage or disposition of personal information or personally identifiable information.
267 While serving as an analytic tool, the PMRM also supports the design of a Privacy Architecture (PA) in
268 response to Use Cases and, as appropriate, for a particular operational environment. It also supports the
269 selection of integrated Services, their underlying Functionality and implementation Mechanisms that are
270 capable of executing Privacy Controls with predictability and assurance. Such an integrated view is
271 important, because business and policy drivers are now both more global and more complex and must
272 thus interact with many loosely coupled systems.
273 The PMRM therefore provides policymakers, the privacy office, privacy engineers, program and business
274 managers, system architects and developers with a tool to improve privacy management and compliance
275 in multiple jurisdictional contexts while also supporting delivery and business objectives. In this Model, the
276 Services associated with privacy (including Security) will be flexible, configurable and scalable and make
277 use of technical Functionality, Business Process and policy components. These characteristics require a
278 specification that is policy-configurable, since there is no uniform, internationally adopted privacy
279 terminology and taxonomy.
280 Analysis and documentation produced using the PMRM will result in a PMA that serves multiple
281 Stakeholders, including privacy officers and managers, general compliance managers, system developers
282 and even regulators in a detailed, comprehensive and integrated manner. The PMRM creates an audit
283 trail from Policy to Privacy Controls to Services and Functions to Mechanisms. This is a key difference
284 between the PMRM and a PIA.
285 There is an additional benefit. While other privacy instruments such as PIAs also serve multiple
286 Stakeholders, the PMRM does so in a way that is different from these others. Such instruments, while
287 nominally of interest to multiple Stakeholders, tend to serve particular groups. For example, PIAs are
288 often of most direct concern to privacy officers and managers, even though developers are often tasked
289 with contributing to them. Such privacy instruments also tend to change hands on a regular basis. As an
290 example, a PIA may start out in the hands of the development or project team, move to the privacy or
291 general compliance function for review and comment, go back to the project for revision, move back to
292 the privacy function for review, and so on. This iterative process of successive handoffs is valuable, but
293 can easily devolve into a challenge and response dynamic that can itself lead to miscommunication and
294 misunderstandings. Typically PIA’s do not trace compliance from Policies to Privacy Controls to Services
295 and Functions on to Mechanisms. Nor are they performed at a granular level.
296 In contrast, the resulting output of using the PMRM - the PMA - will have direct and ongoing relevance for
297 all Stakeholders and is less likely to suffer the above dynamic. This is because the PMA supports
298 productive interaction and collaboration among multiple communities. Although the PMA is fully and
299 continuously a part of each relevant community, each community draws its own meanings from it, based
300 on their needs and perspectives. As long as these meanings are not inconsistent across communities, the
301 PMA can act as a shared, yet heterogeneous, understanding. Thus, the PMA is accessible and relevant
302 to all Stakeholders, facilitating collaboration across relevant communities in a way that other privacy
303 instruments often cannot.
304 This multiple stakeholder capability is especially important today, given the growing recognition that
305 Privacy by Design principles and practices cannot be adopted effectively without a common, structured
306 protocol that enables the linkage of business requirements, policies, and technical implementations.
307 Finally, the PMA can also serve as an important artifact of accountability, in two ways. First, a rigorously
308 developed and documented PMA itself reveals all aspects of privacy management within a Domain or
309 Use Case, making clear the relationship between the Privacy Services, Functionality and Mechanisms in
310 place and their associated Privacy Controls and Policies. Second, in addition to proactively
311 demonstrating that Privacy Controls are in place and implemented via the PMA, the Services may also
312 include functionality that demonstrates accountability at a granular level. Such Functionality implemented
313 in Mechanisms confirms and reports that the Privacy Controls are correctly operating. Thus the privacy
314 office can demonstrate compliance on demand for both design and operational stages.
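The on-demand compliance reporting described above can be sketched as a simple audit log kept by a Mechanism. This is a non-normative illustration; the schema, method names, and control label are assumptions, not part of the specification.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ControlEvent:
    """One record of a Privacy Control firing (hypothetical schema)."""
    control_id: str
    outcome: str          # "enforced" or "violated"
    timestamp: str

@dataclass
class AccountabilityLog:
    """Granular evidence that Privacy Controls are operating correctly."""
    events: list = field(default_factory=list)

    def record(self, control_id: str, outcome: str) -> None:
        self.events.append(ControlEvent(
            control_id, outcome, datetime.now(timezone.utc).isoformat()))

    def compliance_report(self) -> dict:
        """Summarize enforcement per control, for on-demand audits."""
        report = {}
        for e in self.events:
            per = report.setdefault(e.control_id, {"enforced": 0, "violated": 0})
            per[e.outcome] += 1
        return report

log = AccountabilityLog()
log.record("UL-1: no third-party sharing without consent", "enforced")
log.record("UL-1: no third-party sharing without consent", "enforced")
```

A privacy office could query such a report at both design and operational stages to demonstrate compliance on demand.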

21 PMRM-v1.0-cs02 17 May 2016


22 Standards Track Work Product Copyright © OASIS Open 2016. All Rights Reserved. Page 8 of 37
315 1.5 Target Audiences
316 The intended audiences of this document and expected benefits to be realized by each include:
317  Privacy and Risk Officers and Engineers will gain a better understanding of the specific privacy
318 management environment for which they have compliance responsibilities as well as detailed policy
319 and operational processes and technical systems that are needed to achieve their organization’s
320 privacy compliance objectives.
321  Systems/Business Architects will have a series of templates for the rapid development of core
322 systems functionality, developed using the PMRM as a tool.
323  Software and Service Developers will be able to identify what processes and methods are required
324 to ensure that PI is collected, stored, used, shared, transmitted, transferred across-borders, retained
325 or disposed in accordance with requisite privacy control requirements.
326  Public policy makers and business owners will be able to identify any weaknesses or
327 shortcomings of current policies and use the PMRM to establish best practice guidelines where
328 needed. They will also have stronger assurance that the design of business systems and
329 applications, as well as their operational implementations, comply with privacy control requirements.

330 1.6 Specification Summary


331 The PMRM consists of:
332  A conceptual model of privacy management, including definitions of terms;
333  A methodology; and
334  A set of operational Services and Functions, together with the inter-relationships among these three
335 elements.
336 The PMRM, as a conceptual model, addresses all Stakeholder-generated requirements, and is
337 anchored in the principles of Service-Oriented Architecture. It recognizes the value of services operating
338 across departments, systems and Domain boundaries. Given the reliance by the privacy policy
339 community (often because of regulatory mandates in different jurisdictions) on inconsistent, non-
340 standardized definitions of fundamental Privacy Principles, the PMRM includes a non-normative, working
341 set of Operational Privacy Principle definitions (see section 9.1). These definitions may be useful to
342 provide insight into the Model. With their operational focus, these working definitions are not intended to
343 supplant, or in any way suggest a bias for or against, any specific policy or policy set. However, they
344 may prove valuable as a tool to help deal with the inherent biases built into current terminology
345 associated with privacy by abstracting specific operational features and assisting in their categorization.
346 In Figure 1 below we see that the core concern of privacy protection and management is expressed by
347 Stakeholders (including data subjects, policy makers, solution providers, etc.) who help, on the one hand,
348 drive policies (which both reflect and influence actual regulation and lawmaking), and on the other hand,
349 inform the Use Cases that are developed to expose and document specific Privacy Control requirements
350 and the Services and Functions necessary to implement them in Mechanisms.

351

352
353 Figure 1 – The PMRM Model – Achieving Comprehensive Operational Privacy

354
355 The PMRM, as a methodology, covers a series of tasks, outlined in the following sections of the
356 document, concerned with:
357  defining and describing the scope of the Use Cases, either broad or narrow;
358  identifying particular business Domains and understanding the roles played by all participants and
359 systems within the Domains in relation to privacy policies;
360  identifying the data flows and Touch Points for all personal information within a Domain or Domains;
361  specifying various Privacy Controls;
362  identifying the Domains through which PI flows and which require the implementation of Privacy
363 Controls;
364  mapping Domains to the Services and Functions and then to technical and procedural Mechanisms;
365  performing risk and compliance assessments;
366  documenting the PMA for future iterations of this application of the PMRM, for reuse in other
367 applications of the PMRM, and, potentially, to inform a Privacy Architecture.
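As a non-normative illustration, the task series above can be modeled as an ordered but cyclic pipeline, reflecting the iteration the methodology allows. The task names below are paraphrases, not normative labels.

```python
# The methodology tasks, sketched as an ordered but iterable pipeline.
# The specification stresses that steps may be reordered and iterated.
PMRM_TASKS = [
    "define use case scope",
    "identify domains, participants and roles",
    "identify data flows and touch points",
    "specify privacy controls",
    "map controls to services, functions and mechanisms",
    "perform risk and compliance assessment",
    "document the PMA",
]

def next_task(current: str) -> str:
    """Suggest the next task; wrapping around models the iterative cycle."""
    i = PMRM_TASKS.index(current)
    return PMRM_TASKS[(i + 1) % len(PMRM_TASKS)]
```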
368 The specification defines a set of Services and Functions deemed necessary to implement the
369 management of, and compliance with, detailed privacy policies and Privacy Controls within a particular Use
370 Case. The Services are sets of Functions, which form an organizing foundation to facilitate the
371 application of the model and to support the identification of the specific Mechanisms that will implement
372 them. They may optionally be incorporated in a broader Privacy Architecture.
373 The set of operational Services (Agreement, Usage, Validation, Certification, Enforcement, Security,
374 Interaction, and Access) is described in Section 4 below and in the Glossary in section 9.2.
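As a non-normative sketch, the eight operational Services can be represented as an enumeration to which Privacy Controls are later mapped. The control-to-Service mapping shown is illustrative only, not taken from the specification.

```python
from enum import Enum

class Service(Enum):
    """The eight PMRM operational Services (described in Section 4)."""
    AGREEMENT = "Agreement"
    USAGE = "Usage"
    VALIDATION = "Validation"
    CERTIFICATION = "Certification"
    ENFORCEMENT = "Enforcement"
    SECURITY = "Security"
    INTERACTION = "Interaction"
    ACCESS = "Access"

# A Privacy Control is realized by one or more Services; this mapping is
# illustrative only, not normative.
CONTROL_TO_SERVICES = {
    "consent-before-sharing": [Service.AGREEMENT, Service.USAGE,
                               Service.ENFORCEMENT],
    "secure-transmission": [Service.SECURITY],
}
```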
375 The core of this specification is expressed in three major sections: Section 2, “Develop Use Case
376 Description and High-Level Privacy Analysis,” Section 3, “Develop Detailed Privacy Analysis,” and
377 Section 4, “Identify Services and Functions Necessary to Support Privacy Controls.” The detailed analysis
378 is informed by the general findings associated with the high level analysis. However, it is much more
379 granular and requires documentation and development of a Use Case which clearly expresses the
380 complete application and/or business environment within which personal information is collected, stored,
381 used, shared, transmitted, transferred across-borders, retained or disposed.

382 It is important to point out that the model is not generally prescriptive and that users of the PMRM may
383 choose to adopt some parts of the model and not others. They may also address the tasks in a different
384 order, appropriate to the context or to allow iteration and discovery of further requirements as work
385 proceeds. Obviously, a complete use of the model will contribute to a more comprehensive PMA. As
386 such, the PMRM may serve as the basis for the development of privacy-focused capability maturity
387 models and improved compliance frameworks. As mentioned above, the PMRM may also provide a
388 foundation on which to build Privacy Architectures.
389 Again, the use of the PMRM for a particular business Use Case will lead to the production of a PMA. An
390 organization may have one or more PMAs, particularly across different business units, or it may have a
391 unified PMA. Theoretically, a PMA may apply across organizations, states, and even countries or other
392 geo-political boundaries.
393 Figure 2 below shows the high-level view of the PMRM methodology that is used to create a PMA.
394 Although the stages are sequenced for clarity, no step is an absolute pre-requisite for starting work on
395 another step and the overall process will usually be iterative. Equally, the process of conducting an
396 appropriate PMA, and determining how and when implementation will be carried out, may be started at
397 any stage during the overall process.

398
399 Figure 2 – The PMRM Methodology

400 1.7 Terminology


401 References are surrounded with [square brackets] and are in bold text.
402 The key words “MUST”, “MUST NOT”, “REQUIRED”, “SHALL”, “SHALL NOT”, “SHOULD”, “SHOULD
403 NOT”, “RECOMMENDED”, “MAY”, and “OPTIONAL” in this document are to be interpreted as described
404 in [RFC2119].

405 A glossary of key terms used in this specification as well as non-normative definitions for Operational
406 Privacy Principles are included in Section 9 of the document.
407 We note that words and terms used in the discipline of data privacy in many cases have meanings and
408 inferences associated with specific laws, regulatory language, and common usage within privacy
409 communities. The use of such well-established terms in this specification is unavoidable. However, we
410 urge readers to consult the definitions in the Glossary and clarifications in the text to reduce confusion
411 about the use of such terms within this specification. Readers should also be aware that terms used in the
412 different examples are sometimes more “conversational” than in the formal, normative sections of the text
413 and may not necessarily be defined in the Glossary.

414 1.8 Normative References


415 [RFC2119] S. Bradner, Key words for use in RFCs to Indicate Requirement Levels,
416 http://www.ietf.org/rfc/rfc2119.txt, IETF RFC 2119, March 1997.

417 1.9 Non-Normative References


418 [SOA-RM] OASIS Standard, "Reference Model for Service Oriented Architecture 1.0”, 12
419 October 2006. http://docs.oasis-open.org/soa-rm/v1.0/soa-rm.pdf
420 [SOA-RAF] OASIS Specification, “Reference Architecture Foundation for SOA v1.0”,
421 November 2012. http://docs.oasis-open.org/soa-rm/soa-ra/v1.0/cs01/soa-ra-v1.0-
422 cs01.pdf
423 [PBD-SE] OASIS Committee Specification, “Privacy by Design Documentation for Software
424 Engineers Version 1.0.”
425 http://docs.oasis-open.org/pbd-se/pbd-se/v1.0/csd01/pbd-se-v1.0-csd01.pdf
426 [NIST 800-53] NIST Special Publication 800-53 “Security and Privacy Controls for Federal
427 Information Systems and Organizations” Rev 4 (01-22-2015) – Appendix J:
428 Privacy Controls Catalog.
429 http://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-53r4.pdf
430 [ISTPA-OPER] International Security Trust and Privacy Alliance (ISTPA) publication, “Analysis of
431 Privacy Principles: Making Privacy Operational,” v2.0 (2007). https://www.oasis-
432 open.org/committees/download.php/55945/ISTPAAnalysisofPrivacyPrinciplesV2.
433 pdf

434 2 Develop Use Case Description and High-Level
435 Privacy Analysis
436 The first phase in applying the PMRM methodology requires the scoping of the Use Case with which PI is
437 associated - in effect, a complete description of the environment, application or
438 capabilities to which privacy and data protection requirements apply. The extent of the scoping
439 analysis and the definitions of “business environment” or “application” are set by the Stakeholders using
440 the PMRM within a particular Use Case. These may be defined broadly or narrowly, and may include
441 lifecycle (time) elements.
442 The high level analysis may also make use of Privacy Impact Assessments, previous risk assessments,
443 privacy maturity assessments, compliance reviews, and accountability model assessments as determined
444 by Domain Stakeholders. However, the scope of the high level privacy analysis (including all aspects of
445 the business environment or application under review and all relevant privacy policies) must correspond
446 with the scope of analysis covered in Section 3, “Develop Detailed Privacy Use Case Analysis,” below.
447 Note that the examples below refer to a detailed Use Case. The same methodology and model can be
448 used at more abstract levels. Using the PMRM to study an entire business environment to develop
449 Policies, Privacy Controls, Services and Functions, Mechanisms, a PMA and perhaps a Privacy
450 Architecture allows an entity to establish broad guidance for use in future application of the PMRM in
451 another, more-detailed Use Case.

452 2.1 Application and Business Process Descriptions


453 Task #1: Use Case Description
454 Objective Provide a general description of the Use Case

455 Task 1 Example2


456 A California electricity supplier (Utility), with a residential customer base with smart meters installed in
457 homes, offers-reduced electricity rates for evening recharging of vehicles’ batteries. The utility also
458 permits the customer to use the charging station at another customer’s site [such as at a friend’s house]
459 and have the system bill the vehicle owner instead of the customer whose charging station is used.
460 Utility customers register with the utility to enable electric vehicle (EV) charging. An EV Customer
461 (Customer One) plugs in the car at her residence, and the system detects the connection. The utility
462 system is aware of the car’s location, its registered ID number and the approximate charge required
463 (estimated by the car’s onboard computer). Based on Customer One’s preferences, the utility
464 schedules the recharge to take place during the evening hours and at times determined by the utility
465 (for load balancing).
466 The billing department system calculates the amount of money to charge Customer One, based on EV
467 rates, time of charging, and duration of the charge.
468 The following week, Customer One drives to a friend’s home (Customer Two) and needs a quick
469 charge of her vehicle’s battery. When she plugs her EV into Customer Two’s EV charger, the utility
470 system detects Customer Two’s location, vehicle ID number, the fact that the EV is using Customer
471 Two’s system, the date and time, Customer One’s preferences and other operational information...
472 The billing department system calculates the invoice amount to bill the EV Customer One, based on
473 Customer One’s account information and preferences.
474 The utility has a privacy policy that includes selectable options for customers relating to the use of PI
475 associated with location and billing information, and has implemented systems to enforce those
476 policies.

2 The boxed examples are not to be considered as part of the normative text of this document.
477 Task #2: Use Case Inventory
478 Objective Provide an inventory of the business environment, capabilities, applications and policy
479 environment under review at the level of granularity appropriate for the analysis covered
480 by the PMRM and define a High Level Use Case, which will guide subsequent analysis.
481 In order to facilitate the analysis described in the Detailed Privacy Use Case Analysis in
482 Section 3, the components of this Use Case inventory should align as closely as possible
483 with the components that will be analyzed in the corresponding Detailed Privacy Use
484 Case Analysis in Section 4.
485 Note The inventory can include organizational structures, applications and Business
486 Processes; products; policy environment; legal and regulatory jurisdictions; Systems
487 supporting the capabilities and applications; PI; time; and other factors impacting the
488 collection, storage, use, sharing, transmission, cross-border transfer, retention or
489 disposal of PI. The inventory should also include the types of data subjects covered by
490 the Use Case together with specific privacy options (such as policy preferences, privacy
491 settings, etc. if these are formally expressed) for each type of data subject.

492 Task 2 Example


493 Systems: Utility Communications Network, Customer Billing System, EV On Board System…
494 Legal and Regulatory Jurisdictions:
495 California Constitution, Article 1, section 1 gives each citizen an "inalienable right" to
496 pursue and obtain "privacy."
497 Office of Privacy Protection - California Government Code section 11549.5.
498 “Automobile Black Boxes" - Vehicle Code section 9951.
499 …
500 Personal Information Collected on Internet:
501 Government Code section 11015.5. This law applies to state government agencies…
502 The California Public Utilities Commission, which “serves the public interest by protecting
503 consumers and ensuring the provision of safe, reliable utility service and infrastructure at
504 reasonable rates, with a commitment to environmental enhancement and a healthy
505 California economy”…
506 Utility Policy: The Utility has a published Privacy Policy covering the EV recharging/billing application
507 Customer: The customer’s selected settings for policy options presented via customer-facing
508 interfaces.
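Following the convention of the boxed examples, a Task #2 inventory might be captured in a structured record such as the following non-normative sketch; the key names are assumptions chosen to align with the Section 3 analysis.

```python
# Hypothetical record structure for a Task #2 Use Case inventory.
use_case_inventory = {
    "systems": ["Utility Communications Network", "Customer Billing System",
                "EV On Board System"],
    "jurisdictions": ["California Constitution, Article 1, section 1",
                      "California Government Code section 11549.5",
                      "Vehicle Code section 9951"],
    "policies": ["Utility Privacy Policy (EV recharging/billing)"],
    "data_subjects": {"registered_customer":
                      {"privacy_settings": "customer-selected"}},
}

def inventory_complete(inventory: dict) -> bool:
    """Check the inventory covers every component Section 3 will analyze."""
    required = {"systems", "jurisdictions", "policies", "data_subjects"}
    return required <= inventory.keys() and all(inventory[k] for k in required)
```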

509 2.2 Applicable Privacy Policies


510 Task #3: Privacy Policy Conformance Criteria
511 Objective Define and describe the criteria for conformance of the organization or a System or
512 Business Process (identified in the Use Case and inventory) with an applicable Privacy
513 Policy or policies. As with the inventory described in Task #2 above, the conformance
514 criteria should align with the equivalent elements in the Detailed Use Case Analysis
515 described in Section 3. Wherever possible, they should be grouped by the relevant
516 Operational Privacy Principles and required Privacy Controls.
517 Note Whereas Task #2 itemizes the environmental elements relevant to the Use Case, Task #3
518 focuses specifically on the privacy requirements.
519 Task 3 Example
520 Privacy Policy Conformance Criteria:
521 (1) Ensure that the utility does not share PI with third parties without the customer’s consent…etc. For
522 example, a customer may choose not to share their charging location patterns.

523 (2) Ensure that the utility supports strong levels of:
524 (a) Identity authentication
525 (b) Security of transmission between the charging stations and the utility information systems…etc.
526 (3) Ensure that PI is deleted on expiration of retention periods…
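The conformance criteria above lend themselves to expression as checkable predicates. This is a non-normative sketch; the transaction record and field names are assumptions.

```python
# The example criteria expressed as predicates over a hypothetical
# transaction record.
def no_unconsented_sharing(txn: dict) -> bool:
    """Criterion (1): PI is not shared with third parties without consent."""
    return (not txn["shared_with_third_party"]) or txn["customer_consented"]

def retention_respected(txn: dict) -> bool:
    """Criterion (3): PI is deleted once its retention period expires."""
    return (not txn["retention_expired"]) or txn["pi_deleted"]

def conforms(txn: dict) -> bool:
    """A transaction conforms only if every criterion holds."""
    return no_unconsented_sharing(txn) and retention_respected(txn)
```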

527 2.3 Initial Privacy Impact (or other) Assessment(s) [optional]


528 Task #4: Assessment Preparation
529 Objective Include, or prepare, an initial Privacy Impact Assessment, or as appropriate, a risk
530 assessment, privacy maturity assessment, compliance review, or accountability model
531 assessment applicable to the Use Case. Such an assessment can be deferred until a
532 later iteration step (see Section 7) or inherited from a previous exercise.
533 Task 4 Example
534 Since the EV has a unique ID, it can be linked to a specific customer. As such, the customer’s whereabouts
535 may be revealed and tracked through the utility’s transaction systems.
536 The EV charging and vehicle management systems may retain data that can be used to identify
537 charging time and location information that can constitute PI (including driving patterns).
538 Unless safeguards are in place and (where appropriate) under the customer’s control, there is a danger
539 that intentionally anonymized PI nonetheless becomes PII.
540 The utility may build systems to capture behavioral and movement patterns and sell this information to
541 potential advertisers or other information brokers to generate additional revenue. The collection and
542 use of such information requires the explicit, informed consent of the customer.

543 3 Develop Detailed Privacy Analysis
544 Goal Prepare and document a detailed PMA of the Use Case, which corresponds with the
545 High Level Privacy Analysis and the High Level Use Case Description.
546 The Detailed Use Case must be clearly bounded and must include the components in the
547 following sections.

548 3.1 Identify Participants and Systems, Domains and Domain Owners,
549 Roles and Responsibilities, Touch Points and Data Flows (Tasks #5-
550 10)

551 Task #5: Identify Participants


552 Objective Identify Participants having operational privacy responsibilities.
553 A Participant is any Stakeholder responsible for collecting, storing, using, sharing,
554 transmitting, transferring across-borders, retaining or disposing of PI, or who is involved in the
555 lifecycle of PI managed by a Domain, or a System or Business Process within a Domain.
556
557 Task 5 Example
558 Participants Located at the Customer Site:
559 Registered Customers (Customers One and Two)
560 Participants Located at the EV’s Location:
561 Registered Customer Host (Customer Two - Temporary host for EV charging), Customer One -
562 Registered Customer Guest
563 Participants Located within the Utility’s Domain:
564 Service Provider (Utility)
565 Contractors and Suppliers to the Utility

566 Task #6: Identify Systems and Business Processes


567 Objective Identify the Systems and Business Processes where PI is collected, stored, used,
568 shared, transmitted, transferred across-borders, retained or disposed within a Domain.
569 Definition For purposes of this specification, a System or Business Process is a collection of
570 components organized to accomplish a specific function or set of functions having a
571 relationship to operational privacy management.
572 Task 6 Example
573 System Located at the Customer Site(s):
574 Customer Communication Portal
575 EV Physical Re-Charging and Metering System
576 System Located in the EV(s):
577 EV: Device
578 EV On-Board System
579 System Located within the EV Manufacturer’s Domain:
580 EV Charging Data Storage and Analysis System
581 System Located within the Utility’s Domain:
582 EV Program Information System (includes Rates, Customer Charge Orders, Customers enrolled
583 in the program, Usage Info etc.)
584 EV Load Scheduler System
585 Utility Billing System
586 Remote Charge Monitoring System
587 Selection System for selecting and transferring PI to the third party

588 Task #7: Identify Domains and Owners


589 Objective Identify the Domains included in the Use Case definition together with the respective
590 Domain Owners.
591 Definition A Domain includes both physical areas (such as a customer site or home, a customer
592 service center, a third party service provider) and logical areas (such as a wide-area
593 network or cloud computing environment) that are subject to the control of a particular
594 Domain owner.
595 A Domain Owner is the Participant responsible for ensuring that Privacy Controls are
596 implemented in Services and Functions within a given Domain.
597 Note Domains may be under the control of Data Subjects or Participants with a specific
598 responsibility for privacy management within a Domain, such as data controllers;
599 capability providers; data processors; and other distinct entities having defined
600 operational privacy management responsibilities. Domains can be “nested” within wider,
601 hierarchically-structured Domains, which may have their own defined ownership, roles
602 and responsibilities. Individual data subjects may also have Domain Owner characteristics
603 and obligations depending on the specific Use Case.
604 Domain Owner identification is important for purposes of establishing accountability.
605 Task 7 Example
606 Utility Domain:
607 The physical premises, located at…. which includes the Utility’s program information system, load
608 scheduling system, billing system, remote monitoring system and the selection system
609 This physical location is part of a larger logical privacy Domain, owned by the Utility, which extends
610 to the Customer Portal Communication system at the Customer’s site, and the EV On-Board
611 Metering software application System installed in the EV by the Utility, together with cloud-based
612 services hosted by….
613 Customer Domain:
614 The physical extent of the customer’s home and associated property as well as the EV, wherever
615 located, together with the logical area covered by devices under the ownership and control of the
616 customer (such as mobile devices).
617 Vehicle Domain:
618 The Vehicle Management System, installed in the EV by the manufacturer.
619 Ownership
620 The Systems listed above as part of the Utility’s Systems belong to the Utility Domain Owner
621
622 The EV Vehicle Management System belongs to the Customer Domain Owner but is controlled
623 by the Vehicle Manufacturer
624 The EV (with its ID Number) belongs to the Customer Domain Owner and the Vehicle
625 Manufacturer Domain Owners, but the EV ID may be accessed by the Utility.
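A non-normative sketch of the Domain and Domain Owner relationships in this example, with nesting expressed through a parent reference; the class layout and names are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Domain:
    """Illustrative model of a Domain; nesting is expressed via `parent`."""
    name: str
    owner: str                     # the accountable Domain Owner
    parent: Optional["Domain"] = None

utility_logical = Domain("Utility logical privacy Domain", owner="Utility")
utility_premises = Domain("Utility physical premises", owner="Utility",
                          parent=utility_logical)
customer_domain = Domain("Customer Domain", owner="Customer One")

def accountable_owner(domain: Domain) -> str:
    """Domain Owner identification establishes accountability (Task #7)."""
    return domain.owner
```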

626 Task #8: Identify Roles and Responsibilities within a Domain
627 Objective For any given Use Case, identify the roles and responsibilities assigned to specific
628 Participants, Business Processes and Systems within a specific Domain.
629 Note Any Participant may carry multiple roles and responsibilities and these need to be
630 distinguishable, particularly as many functions involved in processing of PI are assigned
631 to functional roles, with explicit authority to act, rather than to a specific Participant.
632 Task 8 Example
633 Role: EV Manufacturer Privacy Officer
634 Responsibilities: Ensure that all PI data flows from the EV On-Board System that communicate with or
635 utilize the Vehicle Management System conform with contractual obligations
636 associated with the Utility and vehicle owner as well as the Collection Limitation and
637 Information Minimization privacy policies.
638 Role: Utility Privacy Officer
639 Responsibilities: Ensure that PI data flows shared with the Third Party Marketing Domain occur
640 according to the customer’s permissions and that the Third Party
641 demonstrates the capability to enforce agreed upon privacy management obligations

642 Task #9: Identify Touch Points


643 Objective Identify the Touch Points at which the data flows intersect with Domains or Systems or
644 Business Processes within Domains.
645 Definition Touch Points are the intersections of data flows across Domains or Systems or
646 Processes within Domains.
647 Note The main purpose for identifying Touch Points in the Use Case is to clarify the data flows
648 and ensure a complete picture of all Domains and Systems and Business Processes in
649 which PI is used.
650 Task 9 Example
651 The Customer Communication Portal provides an interface through which the Customer communicates
652 a charge order to the Utility. This interface is a touch point.
653 When Customer One plugs her EV into the charging station, the EV On-Board System embeds
654 communication functionality to send EV ID and EV Charge Requirements to the Customer
655 Communication Portal. This functionality provides a further touch point.

656 Task #10: Identify Data Flows


657 Objective Identify the data flows carrying PI and Privacy Controls among Domains within the Use
658 Case.
659 Data flows may be multidirectional or unidirectional.
660 Task 10 Example
661 When a charging request event occurs, the Customer Communication Portal sends Customer
662 information, EV identification, and Customer Communication Portal location information to the EV
663 Program Information System managed by the Utility.
664 This Program Information System application uses metadata tags to indicate whether or not the customer’s
665 identification and location data may be shared with authorized third parties, and to prohibit the sharing
666 of data that provides customers’ movement history, if derived from an aggregation of transactions.
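The metadata-tag mechanism in this example might be sketched as follows; the tag and field names are assumptions, not part of the specification.

```python
# Each PI field carries a tag saying whether it may be shared with
# authorized third parties; movement history is never shareable.
pi_record = {
    "customer_id":      {"value": "CUST-001", "share_third_party": True},
    "portal_location":  {"value": "charging station 42",
                         "share_third_party": True},
    "movement_history": {"value": ["..."], "share_third_party": False},
}

def shareable_fields(record: dict) -> dict:
    """Return only the fields whose tags permit third-party sharing."""
    return {name: tagged["value"] for name, tagged in record.items()
            if tagged["share_third_party"]}
```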

667 3.2 Identify PI in Use Case Domains and Systems
668 Objective Specify the PI collected, stored, used, shared, transmitted, transferred across-borders,
669 retained or disposed within Domains or Systems or Business Processes in three
670 categories (Incoming, Internally Generated and Outgoing).

671 Task #11: Identify Incoming PI


672 Definition Incoming PI is PI flowing into a Domain, or a System or Business Process within a
673 Domain.
674 Note Incoming PI may be defined at whatever level of granularity appropriate for the scope of
675 analysis of the Use Case and its Privacy Policies and requirements.

676 Task #12: Identify Internally Generated PI


677 Definition Internally Generated PI is PI created within the Domain or System or Business Process
678 itself.
679 Note Internally Generated PI may be defined at whatever level of granularity appropriate for
680 the scope of analysis of the Use Case and its Privacy Policies and requirements.
681 Examples include device information, time-stamps, location information, and other
682 system-generated data that may be linked to an identity.

683 Task #13: Identify Outgoing PI


684 Definition Outgoing PI is PI flowing from one System to another, or from one Business Process to
685 another, either within a Domain or to another Domain.
686 Note: Outgoing PI may be defined at whatever level of granularity appropriate for the
687 scope of analysis of the Use Case and its Privacy Policies and requirements.
688 Tasks 11, 12, 13 Example
689 Incoming PI:
690 Customer ID received by Customer Communications Portal
691 Internally Generated PI:
692 Current EV location associated with customer information, and time/location information logged
693 by EV On-Board system
694 Outgoing PI:
695 Current EV ID and location information transmitted to Utility Load Scheduler System
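The three categories can be sketched as a simple classification of a PI flow relative to the Domain under analysis; the rule shown is a non-normative simplification.

```python
# Classify a PI flow by comparing its origin and destination against the
# Domain under analysis (Section 3.2 categories).
def classify_pi(origin: str, destination: str, this_domain: str) -> str:
    if origin == this_domain and destination == this_domain:
        return "internally-generated"
    if destination == this_domain:
        return "incoming"
    return "outgoing"
```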

696 3.3 Specify Required Privacy Controls Associated with PI


697 Goal For Incoming, Internally Generated and Outgoing PI, specify the Privacy Controls
698 required to enforce the privacy policy associated with the PI. Privacy controls may be pre-
699 defined or may be derived.
700 Definition Control is a process designed to provide reasonable assurance regarding the
701 achievement of stated objectives.
702 Definition Privacy Controls are administrative, technical and physical requirements employed within
703 an organization or Domain in order to protect and manage PI. They express how privacy
704 policies must be satisfied in an operational setting.

705 Task #14: Specify Inherited Privacy Controls


706 Objective Specify the required Privacy Controls that are inherited from Domains or Systems or
707 Processes.

44 PMRM-v1.0-cs02 17 May 2016


45 Standards Track Work Product Copyright © OASIS Open 2016. All Rights Reserved. Page 19 of 37
708 Task 14 Example:
709 The utility inherits a Privacy Control associated with the Electric Vehicle’s ID (EVID) from the vehicle
710 manufacturer’s privacy policies.
711 The utility inherits Customer One’s Operational Privacy Control Requirements, expressed as privacy
712 preferences, via a link with the customer communications portal when she plugs her EV into Customer
713 Two’s charging station.
714 The utility must apply Customer One’s privacy preferences to the current transaction. The Utility
715 accesses Customer One’s privacy preferences and learns that Customer One does not want her
716 association with Customer Two exported to the Utility’s third party partners. Even though Customer
717 Two’s privacy settings differ regarding his own PI, Customer One’s non-consent to the association
718 being transmitted out of the Utility’s privacy Domain is sufficient to prevent commutative association.
719 Similarly, if Customer Two were to charge his car’s batteries at Customer One’s location, the
720 association between them would also not be shared with third parties.
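The precedence rule in this example, that an association leaves the Domain only if every party consents, can be sketched as follows (the preference store and function names are illustrative, not part of the PMRM):

```python
# Hypothetical preference store: data subject -> whether they consent to
# exporting an association with another party outside the utility's Domain.
preferences = {
    "customer_one": {"share_association": False},
    "customer_two": {"share_association": True},
}

def may_export_association(party_a: str, party_b: str) -> bool:
    """An association is exportable only if BOTH parties consent; a single
    non-consent blocks the transfer, and the rule is symmetric regardless
    of whose charging station is used."""
    return all(preferences[p]["share_association"] for p in (party_a, party_b))

print(may_export_association("customer_one", "customer_two"))  # False
print(may_export_association("customer_two", "customer_one"))  # False
```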

721 Task #15: Specify Internal Privacy Controls


722 Objective Specify the Privacy Controls that are mandated by internal Domain Policies.
723 Task 15 Example
724 Use Limitation Internal Privacy Controls
725 The Utility has adopted and complies with California Code SB 1476 of 2010 (Public Utilities Code §§
726 8380-8381 Use Limitation).
727 It also implements the 2011 California Public Utility Commission (CPUC) privacy rules, recognizing the
728 CPUC’s regulatory privacy jurisdiction over it and third parties with which it shares customer data.
729 Further, it adopts NIST 800-53 Appendix J’s “Control Family” on Use Limitation – e.g. it evaluates any
730 proposed new instances of sharing PI with third parties to assess whether they are authorized and
731 whether additional or new public notice is required.

732 Task #16: Specify Exported Privacy Controls


733 Objective Specify the Privacy Controls that must be exported to other Domains or to Systems or
734 Business Processes within Domains.
735 Task 16 Example
736 The Utility exports Customer One’s privacy preferences associated with her PI to its third party partner,
737 whose systems are capable of understanding and enforcing these preferences. One of her Privacy
738 Control requirements is to not share her EVID and any PI associated with the use of the Utility’s vehicle
739 charging system with marketing aggregators or advertisers.

740 4 Identify Services and Functions Necessary to
741 Support Privacy Controls
742 Privacy Controls are usually stated in the form of a policy declaration or requirement and not in a way that
743 is immediately actionable or implementable. Until now, we have been concerned with the real-world,
744 human side of privacy, but we now turn our attention to the procedures, business processes and
745 technical, system-level components that actually enable privacy. Services and their associated Functions
746 provide the bridge between Privacy Controls and a privacy management implementation by instantiating
747 business and system-level actions governing PI.
748
749 Note: The PMRM provides only a high level description of the functionality associated with each Service.
750 A well-developed PMA will provide the detailed functional requirements associated with Services within a
751 specific Use Case.

752 4.1 Services and Functions Needed to Implement the Privacy Controls
753 A set of operational Services and associated Functionality comprises the organizing structure that will be
754 used to establish the linkage between the required Privacy Controls and the operational Mechanisms
755 (both manual and automated) that are necessary to implement those requirements.
756 The PMRM identifies, at a functional level, eight Privacy Services necessary to support any set of
757 privacy policies and Controls. The eight Services can be logically grouped into three categories:
758
759  Core Policy: Agreement, Usage
760  Privacy Assurance: Validation, Certification, Enforcement, Security
761  Presentation and Lifecycle: Interaction, Access
762 These groupings, illustrated in Table 1 below, are meant to clarify the “architectural” relationship of the
763 Services in an operational design. However, the functions provided by all Services are available for
764 mutual interaction without restriction.
765
766
     Core Policy Services    Privacy Assurance Services              Presentation & Lifecycle Services
     Agreement               Validation        Certification         Interaction
     Usage                   Enforcement       Security              Access
769 Table 1
770 A privacy engineer, system architect or technical manager must be able to define these privacy Services
771 and Functions, and deliver them via procedural and technical Mechanisms. In fact, an important benefit
772 of using the PMRM is to stimulate design and analysis of the specific Mechanisms - both manual and
773 automated - that are needed to implement any set of privacy policies and Controls and their associated
774 Services and Functions. In that sense, the PMRM can be a valuable tool for fostering privacy innovation.

775 The PMRM Services and Functions include important System and Business Process capabilities that are
776 not described in privacy practices and principles. For example, functionality enabling the management of
777 Privacy Policies and their associated Privacy Controls across integrated Systems is implied but not
778 explicitly addressed in privacy principles. Likewise, interfaces and agency are not explicit in the privacy
779 principles, but are necessary to make possible essential operational privacy capabilities.
780 Such inferred capabilities are necessary if information Systems and associated Business Processes are
781 to be made “privacy-configurable and compliant” and to ensure accountability. Without them, enforcing
782 privacy policies in a distributed, fully automated environment will not be possible; businesses, data
783 subjects, and regulators will be burdened with inefficient and error-prone manual processing, inadequate
784 privacy governance, compliance controls and reporting.
785 As used here,
786 - Service is defined as a collection of related Functions that operate for a specified purpose;
787 - Actor is defined as a human, or a system-level digital ‘proxy’ for either a (human) Participant or a
788 (non-human) system-level process or other agent.
789 The eight privacy Services defined are Agreement, Usage, Validation, Certification, Enforcement,
790 Security, Interaction, and Access. These Services represent collections of functionality which
791 make possible the delivery of Privacy Control requirements. The Services are identified as part of the
792 Use Case analysis. Practice with Use Cases has shown that the Services can, together, operationally
793 encompass any arbitrary set of Privacy Control requirements.
794 One Service and its Functions may interact with one or more other Services and their Functions. In other
795 words, Functions under one Service may “call” those under another Service (for example, “pass
796 information to a new Function for subsequent action”). In line with principles of Service-Oriented
797 Architecture (SOA)3, the Services can interact in an arbitrary, interconnected sequence to accomplish a
798 privacy management task or set of privacy lifecycle policy and Control requirements. Use Cases will
799 illustrate such interactions and their sequencing as the PMRM is used to instantiate a particular Privacy
800 Control.
801 Table 2 below provides a description of each Service’s functionality and an informal definition of each
802 Service:

AGREEMENT
  Purpose: Manage and negotiate permissions and rules.
  Functionality: Defines and documents permissions and rules for the handling of PI based on
  applicable policies, data subject preferences, and other relevant factors; provides relevant
  Actors with a mechanism to negotiate, change or establish new permissions and rules; expresses
  the agreements such that they can be used by other Services.

USAGE
  Purpose: Control PI use.
  Functionality: Ensures that the use of PI complies with the terms of permissions, policies,
  laws, and regulations, including PI subjected to information minimization, linking, integration,
  inference, transfer, derivation, aggregation, anonymization and disposal over the lifecycle of
  the PI.

VALIDATION
  Purpose: Ensure PI quality.
  Functionality: Evaluates and ensures the information quality of PI in terms of accuracy,
  completeness, relevance, timeliness, provenance, appropriateness for use and other relevant
  qualitative factors.

CERTIFICATION
  Purpose: Ensure appropriate privacy management credentials.
  Functionality: Ensures that the credentials of any Actor, Domain, System, or system component
  are compatible with their assigned roles in processing PI, and verifies their capability to
  support required Privacy Controls in compliance with defined policies and assigned roles.

ENFORCEMENT
  Purpose: Monitor proper operation, respond to exception conditions, and report on demand
  evidence of compliance where required for accountability.
  Functionality: Initiates monitoring capabilities to ensure the effective operation of all
  Services; initiates response actions, policy execution, and recourse when audit controls and
  monitoring indicate operational faults and failures; records and reports evidence of compliance
  to Stakeholders and/or regulators; provides evidence necessary for Accountability.

SECURITY
  Purpose: Safeguard privacy information and operations.
  Functionality: Provides the procedural and technical mechanisms necessary to ensure the
  confidentiality, integrity, and availability of PI; makes possible the trustworthy processing,
  communication, storage and disposition of PI; safeguards privacy operations.

INTERACTION
  Purpose: Information presentation and communication.
  Functionality: Provides generalized interfaces necessary for presentation, communication, and
  interaction of PI and relevant information associated with PI, encompassing functionality such
  as user interfaces, system-to-system information exchanges, and agents.

ACCESS
  Purpose: View and propose changes to PI.
  Functionality: Enables Data Subjects, as required and/or allowed by permission, policy, or
  regulation, to review their PI that is held within a Domain and propose changes, corrections
  or deletion of their PI.

3 See for example the [SOA-RM] and the [SOA-RAF]
803 Table 2

804 4.2 Service Details and Function Descriptions


805 4.2.1 Core Policy Services

806 1. Agreement Service


807  Defines and documents permissions and rules for the handling of PI based on applicable policies,
808 individual preferences, and other relevant factors. Provides relevant Actors with a mechanism to
809 negotiate or establish new permissions and rules
810  Expresses the Agreements for use by other Services
811 Agreement Service Example
812 As part of its standard customer service agreement, the Utility requests selected customer PI, with
813 associated permissions for use. Customer negotiates with the Utility (in this case via an electronic
814 interface providing opt-in choices) to modify the permissions. The Customer provides the PI to the
815 Utility, with the modified and agreed-to permissions. This agreement is recorded, stored in an
816 appropriate representation, and the customer provided a copy.
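The negotiation flow in this example, requesting permissions, applying the customer's opt-in choices, and recording the agreed result in a representation other Services can consume, might be sketched as follows (function and permission names are illustrative):

```python
import json

def negotiate_agreement(requested: dict, customer_optins: dict) -> str:
    """Merge the utility's requested permissions with the customer's opt-in
    choices; only uses the customer opted into survive. Returns a recordable
    representation of the Agreement (a sketch, not a prescribed format)."""
    agreed = {use: requested[use] and customer_optins.get(use, False)
              for use in requested}
    return json.dumps({"permissions": agreed}, sort_keys=True)

record = negotiate_agreement(
    requested={"billing": True, "marketing": True},
    customer_optins={"billing": True},  # marketing was not opted in
)
print(record)  # {"permissions": {"billing": true, "marketing": false}}
```

Expressing the Agreement in a machine-readable form is what allows the Usage and Enforcement Services, later in this section, to act on it.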

817 2. Usage Service


818  Ensures that the use of PI complies with the terms of any applicable permission, policy, law or
819 regulation,
820 o Including PI subjected to information minimization, linking, integration, inference, transfer,
821 derivation, aggregation, and anonymization,
822 o Over the lifecycle of the PI
823 Usage Service Example
824 A third party has acquired specific PI from the Utility, consistent with contractually agreed permissions
825 for use. The third party has implemented technical functionality capable of enforcing the agreement
826 ensuring that the usage of the PI is consistent with these permissions.
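A minimal sketch of such enforcement at the point of use, refusing any processing whose purpose is not covered by the recorded Agreement (names and the exception type are illustrative):

```python
class UsageError(Exception):
    """Raised when a proposed use of PI is not covered by the Agreement."""

def use_pi(pi: dict, purpose: str, permissions: dict) -> dict:
    """Permit a use of PI only if the stated purpose is agreed; a real
    system would also log every check for later Enforcement reporting."""
    if not permissions.get(purpose, False):
        raise UsageError(f"purpose '{purpose}' not permitted for this PI")
    return pi  # proceed with the permitted use

permissions = {"billing": True, "marketing": False}
use_pi({"evid": "EV-123"}, "billing", permissions)  # allowed
try:
    use_pi({"evid": "EV-123"}, "marketing", permissions)
except UsageError as e:
    print(e)  # purpose 'marketing' not permitted for this PI
```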

827 4.2.2 Privacy Assurance Services

828 3. Validation Service


829  Evaluates and ensures the information quality of PI in terms of accuracy, completeness,
830 relevance, timeliness and other relevant qualitative factors.
831 Validation Service Example

832 The Utility has implemented a system to validate the vehicle’s VIN and onboard EV ID to ensure
833 accuracy.

834 4. Certification Service


835  Ensures that the credentials of any Actor, Domain, System, or system component are compatible
836 with their assigned roles in processing PI
837  Verifies that an Actor, Domain, System, or system component supports defined policies and
838 conforms with assigned roles
839
840 Certification Service Example
841 The Utility operates a data linkage communicating PI and associated policies with the vehicle
842 manufacturer business partner. The Privacy Officers of both companies ensure that their practices and
843 technical implementations are consistent with their agreed privacy management obligations.
844 Additionally, functionality has been implemented which enables the Utility’s and the manufacturer’s
845 systems to communicate confirmation that updated software versions have been registered and support
846 their agreed upon policies.
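The software-version confirmation in this example could be sketched as a lookup against a registry of versions certified as supporting the agreed policies (the registry and version strings below are hypothetical):

```python
# Hypothetical registry of component versions certified as supporting
# the agreed-upon privacy policies.
CERTIFIED_VERSIONS = {"ev-onboard": {"2.1", "2.2"}}

def is_certified(component: str, version: str) -> bool:
    """Confirm a component's registered software version supports the
    agreed policies before PI is exchanged with it."""
    return version in CERTIFIED_VERSIONS.get(component, set())

print(is_certified("ev-onboard", "2.2"))  # True
print(is_certified("ev-onboard", "1.9"))  # False
```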

847 5. Enforcement Service


848  Initiates monitoring capabilities to ensure the effective operation of all Services
849  Initiates response actions, policy execution, and recourse when audit controls and monitoring
850 indicate operational faults and failures
851  Records and reports evidence of compliance to Stakeholders and/or regulators
852  Provides data needed to demonstrate accountability
853
854 Enforcement Service Example
855 The Utility’s maintenance department forwards customer PI to a third party not authorized to receive the
856 information. A routine audit by the Utility’s privacy auditor reveals this unauthorized disclosure practice,
857 alerting the Privacy Officer, who takes appropriate action. This action includes preparation of a Privacy
858 Violation report, together with requirements for remedial action, as well as an assessment of the privacy
859 risk following the unauthorized disclosure. The Utility’s maintenance department keeps records that
860 demonstrate that it only has forwarded customer PI to a third party based upon the agreements with its
861 customers. Such a report may be produced on demand for Stakeholders and regulators.
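The audit step in this example, scanning a disclosure log for recipients outside the authorized set and producing evidence for the Privacy Officer, might look like this (the log structure and recipient names are illustrative):

```python
# Hypothetical disclosure log and set of recipients authorized by agreement.
AUTHORIZED = {"vehicle-manufacturer"}
disclosure_log = [
    {"recipient": "vehicle-manufacturer", "pi": "EVID"},
    {"recipient": "marketing-aggregator", "pi": "charging history"},
]

def audit(log: list) -> list:
    """Return the violation records for any disclosure made to an
    unauthorized recipient, as evidence for remedial action and reporting."""
    return [entry for entry in log if entry["recipient"] not in AUTHORIZED]

violations = audit(disclosure_log)
print(len(violations))             # 1
print(violations[0]["recipient"])  # marketing-aggregator
```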

862 6. Security Service


863  Makes possible the trustworthy processing, communication, storage and disposition of privacy
864 operations
865  Provides the procedural and technical mechanisms necessary to ensure the confidentiality,
866 integrity, and availability of PI
867 Security Service Example
868 PI is encrypted to ensure confidentiality when communicated between the EV and the Utility’s
869 systems and when transmitted to the Utility’s third party partner.
870 Strong, standards-based identity, authentication and authorization management systems are
871 implemented to conform to the Utility’s data security policies.

872 4.2.3 Presentation and Lifecycle Services

873 7. Interaction Service


874  Provides generalized interfaces necessary for presentation, communication, and interaction of PI
875 and relevant information associated with PI
876  Encompasses functionality such as user interfaces, system-to-system information exchanges,
877 and agents
878
879 Interaction Service Example:
880 The Utility uses a Graphical User Interface (GUI) to communicate with customers, including presenting
881 privacy notices associated with the EV Charging application, enabling access to PI disclosures, and
882 providing them with options to modify privacy preferences.
883 The Utility utilizes email alerts to notify customers when policies will be changed and uses postal mail to
884 confirm customer-requested changes.

885 8. Access Service


886  Enables data-subjects, as required and/or allowed by permission, policy, or regulation, to review
887 their PI held within a Domain and propose changes, corrections and/or deletions to it
888 Access Service Example:
889 The Utility has implemented an online service enabling customers to view the Utility systems that collect
890 and use their PI and to interactively manage their privacy preferences for those systems (such as EV
891 Charging) that they have opted to use. For each system, customers are provided the option to view
892 summaries of the PI collected by the Utility and to dispute and correct questionable information.

893 4.3 Identify Services satisfying the Privacy Controls


894 The Services defined in Section 4.1 encompass detailed Functions that are ultimately delivered via
895 Mechanisms (e.g. code, applications, or specific business processes). Such Mechanisms transform the
896 Privacy Controls of section 3.3 into an operational System. Since the detailed Use Case analysis focused
897 on the data flows (Incoming, Internally-Generated, Outgoing) between Systems (and/or Actors), the
898 Service selections should be on the same granular basis.

899 Task #17: Identify the Services and Functions necessary to support
900 operation of identified Privacy Controls
901 Perform this task for each data flow exchange of PI between Systems and Domains.
902 This detailed mapping of Privacy Controls with Services can then be synthesized into consolidated sets of
903 Service and Functions per Domain, System or business environment as appropriate for the Use Case.
904 On further iteration and refinement, the identified Services and Functions can be further delineated by the
905 appropriate Mechanisms.
906 Task 17 Examples
907 1- “Log EV location” based upon
908 a) Internally Generated PI (Current EV location logged by EV On-Board system)
909 b) Outgoing PI (Current EV location transmitted to Utility Load Scheduler System)
910
911 Convert to operational Services as follows:
912 Usage EV On-Board System checks that the reporting of a particular charging location has
913 been opted-in by EV owner per existing Agreement

914 Interaction Communication of EV Location Information to Utility Metering System
915 Enforcement Check that location data has been authorized by EV Owner for reporting and log the
916 action. Notify the Owner for each transaction.
917 Usage EV location data is linked to Agreements

918 2 - “Transmit EV Location to Utility Load Scheduler System”


919 Interaction Communication established between EV Location and ULSS
920 Security Authenticate the ULSS site; authorize the communication; encrypt the transmission
921 Certification ULSS checks the software version of the EV On-Board System to ensure its most
922 recent firmware update maintains compliance with negotiated information storage
923 privacy controls
924 Validation Check the location code and validate the EV Location against customer-accepted
925 locations
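Because Task #17 is performed per data-flow exchange, the resulting Control-to-Service mapping can be kept as a simple per-flow structure and later synthesized per Domain or System. A sketch (the flow name and descriptions are taken from the example above; the structure itself is illustrative):

```python
# A per-flow Service mapping, kept per exchange so it can later be
# consolidated per Domain, System, or business environment.
flow_services = {
    "Transmit EV Location to ULSS": [
        ("Interaction",   "Establish communication between EV and ULSS"),
        ("Security",      "Authenticate ULSS; authorize and encrypt the link"),
        ("Certification", "Check EV On-Board firmware supports agreed controls"),
        ("Validation",    "Validate location against customer-accepted list"),
    ],
}

services_used = {svc for svc, _ in flow_services["Transmit EV Location to ULSS"]}
print(sorted(services_used))
# ['Certification', 'Interaction', 'Security', 'Validation']
```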

926 5 Define Technical and Procedural Mechanisms
927 Supporting Selected Services and Functions
928 Each Service is composed of a set of Functions, which are delivered operationally by manual and
929 technical Mechanisms.
930 The Mechanism step is critical because it requires the identification of specific procedures, applications,
931 technical and vendor solutions, code and other concrete tools that will actually make possible the delivery
932 of required Privacy Controls.

933 5.1 Identify Mechanisms Satisfying the Selected Services and


934 Functions
935 Up to this point in the PMRM methodology, the primary focus of the Use Case analysis has been on the
936 “what”: PI, policies, Privacy Controls, Services and their associated Functions. However, the PMRM
937 methodology also focuses on the “how” – the Mechanisms necessary to deliver the required functionality.

938 Task #18: Identify the Mechanisms that Implement the Identified Services
939 and Functions
940 Examples
941 “Log EV Location”
942 Mechanism: Software Vendor’s DBMS is used as the logging mechanism, and includes active
943 data encryption and key management for security.

944 “Securely Transmit EV Location to Utility Load Scheduler System (ULSS)”


945 Establish a TLS/SSL communication between EV Location and ULSS, including Mechanisms for
946 authentication of the source/destination and authorization of the access.
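As a concrete sketch of such a Mechanism, the TLS channel could be configured with Python's standard `ssl` module as below. The endpoint, and the suggestion of a client certificate for source authentication, are assumptions for illustration, not requirements of the Use Case:

```python
import socket
import ssl

def ulss_tls_context() -> ssl.SSLContext:
    """TLS settings for the EV-to-ULSS link: server certificate and
    hostname verification on, TLS 1.2 or newer. A real deployment might
    also load a client certificate for mutual (source) authentication."""
    ctx = ssl.create_default_context()          # CERT_REQUIRED + hostname check
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

def connect_to_ulss(host: str, port: int) -> ssl.SSLSocket:
    """Wrap a TCP connection to the (hypothetical) ULSS endpoint in TLS."""
    raw = socket.create_connection((host, port))
    return ulss_tls_context().wrap_socket(raw, server_hostname=host)

ctx = ulss_tls_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
```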

947 6 Perform Operational Risk and/or Compliance
948 Assessment
949 Task #19: Conduct Risk Assessment
950 Objective Once the requirements in the Use Case have been converted into operational Services,
951 Functions and Mechanisms, an overall risk assessment should be performed from an
952 operational perspective.
953 Note This risk assessment is operational – distinct from other risk assessments, such as the
954 initial assessments leading to the choice of privacy policies and selection of privacy controls.
955 Additional controls may be necessary to mitigate risks within and across Services. The
956 level of granularity is determined by the Use Case scope and should generally include
957 operational risk assessments for the selected Services within the Use Case.
958 Examples
959 “Log EV location”:
960 Validation EV On-Board System checks that location is not previously rejected by EV owner
961 Risk: On-board System has been corrupted
962 Enforcement If location is previously rejected, then notify the Owner and/or the Utility
963 Risk: On-board System not current
964
965 EV On-Board System logs the occurrence of the Validation for later reporting on request.
966 Risk: On-board System has inadequate storage for recording the data
967
968 Interaction Communicate EV Location to EV On-Board System
969 Risk: Communication link not available
970 Usage EV On-Board System records EV Location in secure storage, together with agreements
971 Risk: Security controls for On-Board System are compromised
972
973 “Transmit EV Location to Utility Load Scheduler System (ULSS)”:
974 Interaction Communication established between EV Location and ULSS
975 Risk: Communication link down
976 Security Authenticate the ULSS site; secure the transmission
977 Risk: ULSS site credentials are not current
978 Certification ULSS checks the credentials of the EV On-Board System
979 Risk: EV On-Board System credentials do not check
980 Validation Validate the EV Location against accepted locations
981 Risk: System cannot access accepted locations
982 Usage ULSS records the EV Location, together with agreements
983 Risk: Security controls for the ULSS are compromised
984
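The per-Service risks identified above can be carried forward in a simple operational risk register, keyed by Service so that mitigating controls can be assigned where gaps appear. A sketch (entries are drawn from the examples above; the mitigations shown are illustrative, not part of the Use Case):

```python
# A per-Service operational risk register for one data flow.
# Mitigation entries are illustrative placeholders, not from the Use Case.
risks = [
    {"service": "Interaction",   "risk": "Communication link down",
     "mitigation": "Queue the transfer and retry"},
    {"service": "Security",      "risk": "ULSS site credentials not current",
     "mitigation": "Automated credential renewal"},
    {"service": "Certification", "risk": "EV On-Board System credentials fail check",
     "mitigation": "Block the transfer and alert the owner"},
]

def risks_for(service: str) -> list:
    """Return the identified risks for a given Service."""
    return [r["risk"] for r in risks if r["service"] == service]

print(risks_for("Security"))  # ['ULSS site credentials not current']
```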

985 7 Initiate Iterative Process
986 Goal A ‘first pass’ through the Tasks above can be used to identify the scope of the Use Case
987 and the underlying privacy policies. Additional iterative passes would serve to refine the
988 Privacy Controls, Services and Functions, and Mechanisms. Later passes could serve to
989 resolve “TBD” sections that are important, but were not previously developed.
990 Note Iterative passes through the analysis will almost certainly reveal additional, finer-grain
991 details. Keep in mind that the ultimate objective is to develop sufficient insight into the
992 Use Case to provide an operational, Service-based, solution.

993 Task #20: Iterate the analysis and refine


994 Iterate the analysis in the previous sections, seeking further refinement and detail. Repeat the
995 process as often as needed to reach the level of operational detail the Use Case requires.

996 8 Conformance
997 8.1 Introduction
998 The PMRM as a “model” is abstract. However, as a Methodology it is through the process of developing
999 a detailed Use Case and a PMA that important levels of detail emerge, enabling a complete picture of
1000 how privacy risks and privacy requirements are being managed. As a Methodology the PMRM – richly
1001 detailed and having multiple, iterative task levels - is intentionally open-ended and can help users build
1002 PMAs at whatever level of complexity they require.
1003 Using the PMRM, detailed privacy service profiles, sector-specific implementation criteria, and
1004 interoperability testing, implemented through explicit, executable, and verifiable methods, can emerge and
1005 may lead to the development of detailed compliance and conformance criteria.
1006 In the meantime, the following statements indicate whether, and if so to what extent, each of the Tasks
1007 outlined in Sections 2 to 7 above, are to be used in a target work product (such as a privacy analysis,
1008 privacy impact assessment, privacy management framework, etc.) in order to claim conformance to the
1009 PMRM, as currently documented.

1010 8.2 Conformance Statement


1011 The terms “MUST”, “REQUIRED”, “RECOMMENDED”, and “OPTIONAL” are used below in conformance
1012 with [RFC 2119].

1013 Any work product claiming conformance with PMRM v1.0


1014 1. MUST result from the documented performance of the Tasks outlined in Sections 2 to 7 above
1015 and where,
1016 2. Tasks #1-3, 5-18 are REQUIRED;
1017 3. Tasks # 19 and 20 are RECOMMENDED;
1018 4. Task #4 is OPTIONAL.

1019 9 Operational Definitions for Privacy Principles and
1020 Glossary
1021 Note: This section is for information and reference only. It is not part of the normative text of the
1022 document.
1023 As explained in the introduction, every specialized Domain is likely to create and use a Domain-specific
1024 vocabulary of concepts and terms that should be used and understood in the specific context of that
1025 Domain. PMRM is no different and this section contains such terms.
1026 In addition, a number of “operational definitions” are included in the PMRM as an aid to support
1027 development of the “Detailed Privacy Use Case Analysis” described in Section 4. Their use is completely
1028 optional, but may be helpful in organizing privacy policies and controls where there are inconsistencies in
1029 definitions across policy boundaries or where existing definitions do not adequately express the
1030 operational characteristics associated with the Privacy Principles below.
1031
1032 These Operational Privacy Principles are intended to support the Principles in the OASIS PbD-SE
1033 Specification and may be useful in understanding the operational implications of Privacy Principles
1034 embodied in international laws and regulations and adopted by international organizations.

1035 9.1 Operational Privacy Principles


1036 The following 14 Operational Privacy Principles are composite definitions, intended to illustrate the
1037 operational and technical implications of commonly accepted Privacy Principles. They were derived from
1038 a review of international legislative and regulatory instruments (such as the U.S. Privacy Act of 1974 and
1039 the EU Data Protection Directive) in the ISTPA document, “Analysis of Privacy Principles: Making Privacy
1040 Operational,” v2.0 (2007). They have been updated slightly for use in the PMRM. These operational
1041 Privacy Principles can serve as a sample set to assist privacy practitioners. They are “composite”
1042 definitions because there is no single and globally accepted set of Privacy Principles and so each
1043 definition includes the policy expressions associated with each term as found in all 14 instruments.
1044 Accountability
1045 Functionality enabling the ability to ensure and demonstrate compliance with privacy policies to the
1046 various Domain Owners, Stakeholders, regulators and data subjects by the privacy program,
1047 business processes and technical systems.
1048 Notice
1049 Functionality providing Information, in the context of a specified use and in an open and transparent
1050 manner, regarding policies and practices exercised within a Domain including: definition of the
1051 Personal Information collected; its use (purpose specification); its disclosure to parties within or
1052 external to the Domain; practices associated with the maintenance and protection of the information;
1053 options available to the data subject regarding the processor’s privacy practices; retention and
1054 deletion; changes made to policies or practices; and other information provided to the data subject at
1055 designated times and under designated circumstances.
1056 Consent and Choice
1057 Functionality enabling data subjects to agree to the collection and/or specific uses of some or all of
1058 their PI either through an opt-in affirmative process, opt-out, or implied (not choosing to opt-out when
1059 this option is provided). Such functionality may include the capability to support sensitive Information,
1060 informed consent, choices and options, change of use consent, and consequences of consent denial.
1061 Collection Limitation and Information Minimization
1062 Functionality, exercised by the information processor, that limits the personal information collected,
1063 processed, communicated and stored to the minimum necessary to achieve a stated purpose and,
1064 when required, demonstrably collected by fair and lawful means.

1065 Use Limitation
1066 Functionality, exercised by the information processor, that ensures that Personal Information will not
1067 be used for purposes other than those specified and accepted by the data subject or provided by law,
1068 and not maintained longer than necessary for the stated purposes.
1069 Disclosure
1070 Functionality that enables the transfer, provision of access to, use for new purposes, or release in any
1071 manner, of Personal Information managed within a Domain in accordance with notice and consent
1072 permissions and/or applicable laws and functionality making known the information processor’s
1073 policies to external parties receiving the information.
1074 Access, Correction and Deletion
1075 Functionality that allows an adequately identified data subject to discover, correct or delete, Personal
1076 Information managed within a Privacy Domain; functionality providing notice of denial of access;
1077 options for challenging denial when specified; and “right to be forgotten” implementation.
1078 Security/Safeguards
1079 Functionality that ensures the confidentiality, availability and integrity of Personal Information
1080 collected, used, communicated, maintained, and stored; and that ensures specified Personal
1081 Information will be de-identified and/or destroyed as required.
1082 Information Quality
1083 Functionality that ensures that information collected and used is adequate for purpose, relevant for
1084 purpose, accurate at time of use, and, where specified, kept up to date, corrected or destroyed.
1085 Enforcement
1086 Functionality that ensures compliance with privacy policies, agreements and legal requirements and
1087 to give data subjects a means of filing complaints of compliance violations and having them
1088 addressed, including recourse for violations of law, agreements and policies, with optional linkages to
1089 redress and sanctions. Such Functionality includes alerts, audits and security breach management.
1090 Openness
1091 Functionality, available to data subjects, that allows access to an information processor’s notice and
1092 practices relating to the management of their Personal Information and that establishes the existence,
1093 nature, and purpose of use of Personal Information held about the data subject.
1094 Anonymity
1095 Functionality that prevents data being collected or used in a manner that can identify a specific
1096 natural person.
1097 Information Flow
1098 Functionality that enables the communication of personal information across geo-political jurisdictions
1099 by private or public entities involved in governmental, economic, social or other activities in
1100 accordance with privacy policies, agreements and legal requirements.
1101 Sensitivity
1102 Functionality that provides special handling, processing, security treatment or other treatment of
1103 specified information, as defined by law, regulation or policy.
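The Operational Privacy Principles above are definitional, but a brief non-normative sketch may help show how one of them, Sensitivity, could translate into functionality. The data categories and handling names below are illustrative assumptions, not defined by this specification.

```python
# Illustrative, non-normative sketch: routing specified information to
# special handling based on a sensitivity classification. The categories
# and treatments below are hypothetical examples, not part of the PMRM.

SPECIAL_HANDLING = {
    "health": "encrypt_and_restrict",     # e.g., protected health information
    "financial": "encrypt_and_restrict",  # e.g., non-public financial data
    "ordinary": "standard_processing",
}

def handling_for(category: str) -> str:
    """Return the treatment required for a data category, defaulting to
    standard processing when no special rule is defined by law or policy."""
    return SPECIAL_HANDLING.get(category, "standard_processing")
```

In practice the mapping would be driven by the applicable law, regulation or policy rather than a hard-coded table.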

1104 9.2 Glossary


1105 Note: This Glossary does not include the Operational Privacy Principles listed in Section 9.1 above. They
1106 are defined separately, given their composite formulation from disparate privacy laws and regulations.
1107 Access Service
1108 Enables Data Subjects, as required and/or allowed by permission, policy, or regulation, to review their
1109 PI that is held within a Domain and propose changes, corrections or deletion of their PI.
1110 Accountability
1111 Privacy principle intended to ensure that controllers and processors are more generally in control and

71 PMRM-v1.0-cs02 17 May 2016


72 Standards Track Work Product Copyright © OASIS Open 2016. All Rights Reserved. Page 32 of 37
1112 in the position to ensure and demonstrate compliance with privacy principles in practice. This may
1113 require the inclusion of business processes and/or technical controls in order to ensure compliance
1114 and provide evidence (such as audit reports) to demonstrate compliance to the various Domain
1115 Owners, Stakeholders, regulators and data subjects.
1116 Agreement Service
1117 Defines and documents permissions and rules for the handling of PI based on applicable policies,
1118 individual preferences, and other relevant factors. Provides relevant Actors with a mechanism to
1119 negotiate or establish new permissions and rules. Expresses the Agreements for use by other
1120 Services.
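As a non-normative sketch only, an Agreement Service might document permissions as records that other Services can query; the field names and the single `permits` query below are assumptions made for this example, not part of the specification.

```python
from dataclasses import dataclass, field

# Illustrative, non-normative sketch of an Agreement Service: permissions
# for handling PI are documented per data subject and exposed for use by
# other Services. All names here are hypothetical.

@dataclass
class Agreement:
    data_subject: str
    purposes: set = field(default_factory=set)  # uses the subject has agreed to

    def grant(self, purpose: str) -> None:
        """Establish a new permission (e.g., after negotiation)."""
        self.purposes.add(purpose)

    def permits(self, purpose: str) -> bool:
        """Express the Agreement for use by other Services."""
        return purpose in self.purposes

agreement = Agreement("subject-001")
agreement.grant("billing")
```

A production Agreement Service would also capture the applicable policies and preference sources behind each permission, not just the resulting grants.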
1121 Actor
1122 A human or a system-level, digital ‘proxy’ for either a (human) Participant (or their delegate)
1123 interacting with a system or a (non-human) in-system process or other agent.
1124 Audit Controls
1125 Processes designed to provide reasonable assurance regarding the effectiveness and efficiency of
1126 operations and compliance with applicable policies, laws, and regulations.
1127 Business Process
1128 A business process is a collection of related, structured activities or tasks that produce a specific
1129 service or product (serve a particular goal) for a particular customer or customers within a Use Case.
1130 It may often be visualized as a flowchart of a sequence of activities with interleaving decision points
1131 or as a process matrix of a sequence of activities with relevance rules based on data in the process.
1132 Certification Service
1133 Ensures that the credentials of any Actor, Domain, System, or system component are compatible with
1134 their assigned roles in processing PI and verifies their capability to support required Privacy Controls in
1135 compliance with defined policies and assigned roles.
1136 Control
1137 A process designed to provide reasonable assurance regarding the achievement of stated policies,
1138 requirements or objectives.
1139 Data Subject
1140 An identified or identifiable person to whom the personal data relate.
1141 Domain
1142 A physical or logical area within the business environment or the Use Case that is subject to the
1143 control of a Domain Owner(s).
1144 Domain Owner
1145 A Participant having responsibility for ensuring that Privacy Controls are implemented and managed
1146 in business processes and technical systems in accordance with policy and requirements.
1147 Enforcement Service
1148 Initiates monitoring capabilities to ensure the effective operation of all Services. Initiates response
1149 actions, policy execution, and recourse when audit controls and monitoring indicate operational faults
1150 and failures. Records and reports evidence of compliance to Stakeholders and/or regulators.
1151 Provides evidence necessary for Accountability.
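The Enforcement Service definition above combines monitoring, response initiation, and evidence recording; a minimal non-normative sketch, with hypothetical class and method names, might look like this:

```python
# Illustrative, non-normative sketch of an Enforcement Service: observe
# Service operations, record evidence for Accountability, and initiate a
# response action when an operational fault is indicated. All names are
# assumptions for this example, not defined by the PMRM.

class EnforcementService:
    def __init__(self):
        self.evidence = []  # compliance evidence for Stakeholders/regulators

    def monitor(self, service_name: str, ok: bool) -> str:
        """Record the observation; initiate a response on operational fault."""
        self.evidence.append((service_name, "ok" if ok else "fault"))
        return "no_action" if ok else self.respond(service_name)

    def respond(self, service_name: str) -> str:
        """Placeholder response action (policy execution, recourse, alert)."""
        return f"response_initiated:{service_name}"

enforcement = EnforcementService()
```

The retained `evidence` list stands in for the audit reports and records that the Accountability principle requires in practice.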
1152 Exported Privacy Controls
1153 Privacy Controls which must be exported to other Domains or to Systems or Processes within
1154 Domains.
1155 Function
1156 Activities or processes within each Service intended to satisfy a Privacy Control.
1157 Incoming PI
1158 PI flowing into a Domain, or a System or Business Process within a Domain.

1159 Inherited Privacy Controls
1160 Privacy Controls which are inherited from Domains, or Systems or Business Processes.
1161 Interaction Service
1162 Provides generalized interfaces necessary for presentation, communication, and interaction of PI and
1163 relevant information associated with PI, encompassing functionality such as user interfaces, system-
1164 to-system information exchanges, and agents.
1165 Internally-Generated PI
1166 PI created within the Domain, Business Process or System itself.
1167 Internal Privacy Controls
1168 Privacy Controls which are created within the Domain, Business Process or System itself.
1169 Mechanism
1170 The packaging and implementation of Services and Functions into a manual or automated solution.
1172 Monitor
1173 To observe the operation of processes and to indicate when exception conditions occur.
1174 Operational Privacy Principles
1175 A non-normative composite set of Privacy Principle definitions derived from a review of a number of
1176 relevant international legislative and regulatory instruments. They are intended to illustrate the
1177 operational and technical implications of the principles.
1178 Outgoing PI
1179 PI flowing out of one system or business process to another system or business process within a
1180 Domain or to another Domain.
1181 Participant
1182 A Stakeholder creating, managing, interacting with, or otherwise subject to, PI managed by a System
1183 or business process within a Domain or Domains.
1184 PI
1185 Personal Information – any data that describes some attribute of, or that is uniquely associated with,
1186 a natural person.
1187 Note: The PMRM uses this term throughout the document as a proxy for other terminology, such
1188 as PII, personal data, non-public personal financial information, protected health information, and
1189 sensitive personal information.
1190 PII
1191 Personally-Identifiable Information – any (set of) data that can be used to uniquely identify a natural
1192 person.
1193 Policy
1194 Laws, regulations, contractual terms and conditions, or operational rules or guidance associated with
1195 the collection, use, transmission, storage or destruction of personal information or personally
1196 identifiable information.
1197 Privacy Architecture (PA)
1198 An integrated set of policies, Controls, Services and Functions implemented in Mechanisms
1199 appropriate not only for a given Use Case resulting from use of the PMRM but applicable more
1200 broadly for future Use Cases.
1201 Privacy by Design (PbD)
1202 Privacy by Design is an approach to systems engineering which takes privacy into account
1203 throughout the whole engineering process. The concept is an example of value sensitive design, i.e.,
1204 taking human values into account in a well-defined manner throughout the whole process, and may
1205 have been derived from it. The concept originates in a joint report on “Privacy-enhancing
1206 technologies” by a joint team of the Information and Privacy Commissioner of Ontario, Canada, the
1207 Dutch Data Protection Authority and the Netherlands Organisation for Applied Scientific Research in
1208 1995. (Wikipedia)
1209 Privacy Control
1210 An administrative, technical or physical safeguard employed within an organization or Domain in
1211 order to protect and manage PI.
1212 Privacy Impact Assessment (PIA)
1213 A Privacy Impact Assessment is a tool for identifying and assessing privacy risks throughout the
1214 development life cycle of a program or System.
1215 Privacy Management
1216 The collection of policies, processes and methods used to protect and manage PI.
1217 Privacy Management Analysis (PMA)
1218 Documentation resulting from use of the PMRM and that serves multiple Stakeholders, including
1219 privacy officers, engineers and managers, general compliance managers, and system developers.
1220 Privacy Management Reference Model and Methodology (PMRM)
1221 A model and methodology for understanding and analyzing privacy policies and their management
1222 requirements in defined Use Cases; and for selecting the Services and Functions and packaging
1223 them into Mechanisms which must be implemented to support Privacy Controls.
1224 Privacy Policy
1225 Laws, regulations, contractual terms and conditions, or operational rules or guidance associated with
1226 the collection, use, transmission, trans-boarder flows, storage, retention or destruction of Personal
1227 Information or personally identifiable information.
1228 Privacy Principles
1229 Foundational terms which represent expectations, or high level requirements, for protecting personal
1230 information and privacy, and which are organized and defined in multiple laws and regulations, and in
1231 publications by audit and advocacy organizations, and in the work of standards organizations.
1232 Service
1233 A defined collection of related Functions that operate for a specified purpose. For the PMRM, the
1234 eight Services and their Functions, when selected, satisfy Privacy Controls.
1235 Requirement
1236 A requirement is some quality or performance demanded of an entity in accordance with certain fixed
1237 regulations, policies, controls or specified Services, Functions, Mechanisms or Architecture.
1238 Security Service
1239 Provides the procedural and technical mechanisms necessary to ensure the confidentiality, integrity,
1240 and availability of PI; makes possible the trustworthy processing, communication, storage and
1241 disposition of PI; safeguards privacy operations.
1242 Stakeholder
1243 An individual or organization having an interest in the privacy policies, privacy controls, or operational
1244 privacy implementation of a particular Use Case.
1245 System
1246 A collection of components organized to accomplish a specific function or set of functions having a
1247 relationship to operational privacy management.
1248 Touch Point
1249 The intersection of data flows with Actors, Systems or Processes within Domains.
1250 Use Case

1251 In software and systems engineering, a use case is a list of actions or event steps, typically
1252 defining the interactions between a role (known in the Unified Modeling Language as an actor)
1253 and a system, to achieve a goal. The actor can be a human, an external system, or time.
1254 Usage Service
1255 Ensures that the use of PI complies with the terms of permissions, policies, laws, and regulations,
1256 including PI subjected to information minimization, linking, integration, inference, transfer, derivation,
1257 aggregation, anonymization and disposal over the lifecycle of the PI.
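A non-normative way to read the Usage Service definition is as a gate: a proposed operation on PI proceeds only when every applicable rule set allows it. The rule sets and function name below are assumptions for illustration.

```python
# Illustrative, non-normative sketch of a Usage Service check: a proposed
# use of PI (e.g., aggregation, transfer, anonymization) is permitted only
# if permissions, policies, laws, and regulations all allow it. The rule
# sets here are hypothetical.

def usage_permitted(operation: str, rule_sets: list[set]) -> bool:
    """Return True only when the operation appears in every rule set."""
    return all(operation in rules for rules in rule_sets)

permissions = {"aggregation", "transfer"}   # from the Agreement Service
policies = {"aggregation", "anonymization"} # from applicable policy
```

Real implementations would evaluate richer rules (purposes, retention periods, minimization constraints) rather than simple membership.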
1258 Validation Service
1259 Evaluates and ensures the information quality of PI in terms of accuracy, completeness, relevance,
1260 timeliness, provenance, appropriateness for use and other relevant qualitative factors.
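The quality factors named in the Validation Service definition can be made concrete with a small non-normative sketch covering two of them, completeness and timeliness; the field names and the 365-day freshness threshold are assumptions, and the remaining factors (accuracy, relevance, provenance) are omitted.

```python
from datetime import date

# Illustrative, non-normative sketch of a Validation Service: evaluate a
# PI record against simple information-quality checks. Field names and the
# freshness threshold are hypothetical examples.

def quality_issues(record: dict, today: date) -> list[str]:
    """Return a list of information-quality issues found in the record."""
    issues = []
    for required in ("name", "address", "last_verified"):
        if not record.get(required):
            issues.append(f"incomplete:{required}")
    verified = record.get("last_verified")
    if verified and (today - verified).days > 365:
        issues.append("stale:last_verified")
    return issues
```

An empty result signals the record passed these checks; a fuller Validation Service would also weigh appropriateness for the intended use.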

1261 9.3 PMRM Acronyms


1262 CPUC California Public Utilities Commission
1263 DBMS Database Management System
1264 EU European Union
1265 EV Electric Vehicle
1266 GUI Graphical User Interface
1267 IoT Internet of Things
1268 NIST National Institute of Standards and Technology
1269 OASIS Organization for the Advancement of Structured Information Standards
1270 PA Privacy Architecture
1271 PbD Privacy by Design
1272 PbD-SE Privacy by Design Documentation for Software Engineers
1273 PI Personal Information
1274 PII Personally Identifiable Information
1275 PIA Privacy Impact Assessment
1276 PMA Privacy Management Analysis
1277 PMRM Privacy Management Reference Model and Methodology
1278 PMRM TC Privacy Management Reference Model Technical Committee
1279 RFC Request for Comments
1280 SOA Service Oriented Architecture
1281 TC Technical Committee
1282 ULSS Utility Load Scheduler System

1283 Appendix A. Acknowledgments
1284 The following individuals have participated in the creation of this specification and are gratefully
1285 acknowledged:
1286 PMRM V1.0 CS01 Participants:
1287
1288 Peter F Brown, Individual Member
1289 Gershon Janssen, Individual Member
1290 Dawn Jutla, Saint Mary’s University
1291 Gail Magnuson, Individual Member
1292 Joanne McNabb, California Office of Privacy Protection
1293 John Sabo, Individual Member
1294 Stuart Shapiro, MITRE Corporation
1295 Michael Willett, Individual Member
1296
1297 PMRM V1.0 CS02 Participants:
1298 Michele Drgon, Individual Member
1299 Gershon Janssen, Individual Member
1300 Dawn Jutla, Saint Mary’s University
1301 Gail Magnuson, Individual Member
1302 Nicolas Notario O’Donnell
1303 John Sabo, Individual Member
1304 Michael Willett, Individual Member
