Self-Evaluation Workshop Kickoff
Organization Name
Agenda
1. Opening Remarks
2. C2M2 Overview
3. Self-Evaluation Overview
4. Scoring and Interpretation
5. Questions
- 2 -
Facilitator Opening Remarks
Facilitator Name, Title, Organizational Business Unit
Sponsor Opening Remarks
Sponsor Name, Title, Organizational Business Unit
Cybersecurity Capability Maturity Model (C2M2) Overview
C2M2 Version 2.1 Overview
- 6 -
The C2M2 is a free tool to help organizations
evaluate their cybersecurity capabilities and
optimize their security investments.
• Designed for any organization regardless of
ownership, structure, size, or industry
• Uses a set of 350+ industry-vetted
cybersecurity practices focused on both
information technology (IT) and operations
technology (OT) assets and environments
• Results help users prioritize cybersecurity
investment decisions based on their risk
• Developed in 2012 and maintained
through an extensive public-private
partnership between the U.S. Department
of Energy’s Office of Cybersecurity, Energy
Security, and Emergency Response and
numerous government, industry, and
academic organizations
• Recent updates in 2022 reflect new
technologies, threats, and practices
Key Features of the C2M2
• Maturity Model: The C2M2 consists of cybersecurity practices that are organized into three progressive levels of cybersecurity maturity.
• Management Activities: Management activities measure the extent to which cybersecurity is ingrained in an organization’s culture.
• Specificity: The C2M2 is descriptive, not prescriptive. Practice statements focus on outcomes that may be implemented through any number of measures.
• Scoping: The C2M2 may be applied to an entire enterprise or to individual parts of the enterprise to enable users to select an appropriate level of granularity.
• Usability: A C2M2 self-evaluation can be completed in one day using a free tool that securely records results and generates a detailed, graphical report.
- 7 -
Model Organized by 10 Domains
• Domains are logical groupings of cybersecurity
practices
• Each domain has a short name for ease of
reference
• ASSET: Asset, Change, and Configuration Management
• THREAT: Threat and Vulnerability Management
• RISK: Risk Management
• ACCESS: Identity and Access Management
• SITUATION: Situational Awareness
• RESPONSE: Event and Incident Response, Continuity of Operations
• THIRD-PARTIES: Third-Party Risk Management
• WORKFORCE: Workforce Management
• ARCHITECTURE: Cybersecurity Architecture
• PROGRAM: Cybersecurity Program Management
- 8 -
Model Structure
- 9 -
• The model contains 10 domains
• Each domain includes multiple approach objectives, which are unique to that domain
• Each domain includes one management objective, which is similar across domains
• Approach objectives are supported by a progression of practices that are unique to the domain
• Each management objective is supported by a progression of practices that are similar in each domain and describe institutionalization activities
(diagram hierarchy: Model > Domain > Approach Objectives, with practices at MIL1, MIL2, and MIL3, and Management Objectives, with practices at MIL2 and MIL3)
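The hierarchy described here (model, domains, approach and management objectives, practices at MILs) can be sketched as a small data model. The following Python is illustrative only; all class names, field names, and the example objective name are invented for this sketch and are not official C2M2 artifacts.

```python
from dataclasses import dataclass, field

@dataclass
class Practice:
    identifier: str  # e.g., "RISK-1a" (example identifier)
    mil: int         # maturity indicator level: 1, 2, or 3

@dataclass
class Objective:
    name: str
    kind: str  # "approach" (unique to a domain) or "management" (one per domain)
    practices: list[Practice] = field(default_factory=list)

@dataclass
class Domain:
    short_name: str  # e.g., "RISK"
    objectives: list[Objective] = field(default_factory=list)

# A toy domain with one approach objective; the MIL values are invented.
# A management objective would carry practices only at MIL2 and MIL3.
risk = Domain("RISK", [
    Objective("Example approach objective", "approach",
              [Practice("RISK-1a", 1), Practice("RISK-1b", 2)]),
])
```

The nesting mirrors the diagram: a domain owns its objectives, and each objective owns a progression of practices tagged with a MIL.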
Maturity Indicator Level Descriptions
• MIL0: Practices are not performed
• MIL1: Initial practices are performed but may be ad hoc (performance depends largely on the initiative and experience of the individual or team)
• MIL2
  • Management characteristics: practices are documented; adequate resources are provided to support the process
  • Approach characteristic: practices are more complete or advanced than at MIL1
• MIL3
  • Management characteristics: activities are guided by policies (or other organizational directives); responsibility, accountability, and authority for performing the practices are assigned; personnel performing the practices have adequate skills and knowledge; the effectiveness of activities in the domain is evaluated and tracked
  • Approach characteristic: practices are more complete or advanced than at MIL2
- 10 -
Maturity Indicator Level Considerations
• Maturity indicator levels (MILs) apply independently to each domain
• MILs are cumulative within each domain
• To achieve MIL3 in a domain, the organization must perform the MIL1, MIL2,
and MIL3 practices in the domain
- 11 -
Model at a Glance
(figure: a grid of the 10 model domains, Asset, Threat, Risk, Access, Situation, Response, Third-Parties, Workforce, Architecture, and Program, against the three maturity indicator levels MIL1 through MIL3, with the MIL1 practices for the RISK domain highlighted)
• Three maturity indicator levels (MILs): defined progressions of practices
• 10 model domains: logical groupings of cybersecurity practices
- 12 -
C2M2 Architecture – ACCESS Domain Example
(figure callouts: domain name; domain objectives; practices organized by objectives and MILs)
- 13 -
Using the Model
• Perform a Self-Evaluation: determine the
implementation of cybersecurity activities
within the organization
• Analyze Identified Gaps: determine whether
the gaps are meaningful and important and
should be addressed
• Prioritize and Plan: prioritize the actions
needed to fully implement the practices to
achieve the desired capability
• Implement Plan: implement the plans defined
in the previous step to address the identified
gaps
- 14 -
Self-Evaluation Overview
Self-Evaluation Roles
1. Sponsor
2. Organizer
3. Facilitator
4. Subject Matter Experts (SMEs)
5. Observers
6. Scribe
7. Support Staff
8. Participants
- 16 -
Defining the Scope of the Self-Evaluation
Identify the function in scope for the self-evaluation
• “Function” is the part of the organization to which the C2M2 is applied
• Consider the level of similarity of security management when selecting a function
• Consider the systems and organizations that share interdependencies with the
function
A self-evaluation for the in-scope function would include
• Assets important to the delivery of the function
• Assets within the function that may be leveraged to achieve a threat objective
• Other potential assets within the function (e.g., data in the cloud, mobile devices)
• Third-parties that have access to, control of, or custody of the function’s assets
• Third-parties upon which the function depends
- 17 -
Agreed-Upon Function and Scope
• Placeholder for the organization performing the self-evaluation to describe
the organization’s identified function in-scope for the self-evaluation
- 18 -
Self-Evaluation Process
• The facilitator will assist the group in reaching consensus on responses
• Select the response that best matches the level of implementation for each
practice
• If consensus cannot be reached, moving on and returning later may help
facilitate consensus
- 19 -
Self-Evaluation Tool
• The self-evaluation tool will not
create a self-evaluation report unless
a response is selected for every
practice
• If the organization has chosen not to
implement a practice, select
Not Implemented
• Use the notes field to capture
important information (e.g., rationale
for responses) that may be of use
later in explaining self-evaluation
results
- 20 -
Response Options
• Fully Implemented (FI): Complete
• Largely Implemented (LI): Complete, but with a recognized opportunity for improvement
• Partially Implemented (PI): Incomplete; there are multiple opportunities for improvement
• Not Implemented (NI): Absent; the practice is not performed by the organization
- 21 -
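As an illustration, the four response options can be modeled as a small enum. The grouping of FI and LI as the responses that satisfy a practice reflects the MIL-achievement rule stated elsewhere in this deck; the Python names here are invented and are not part of the C2M2 self-evaluation tool.

```python
from enum import Enum

class Response(Enum):
    FI = "Fully Implemented"      # complete
    LI = "Largely Implemented"    # complete, with a recognized opportunity for improvement
    PI = "Partially Implemented"  # incomplete; multiple opportunities for improvement
    NI = "Not Implemented"        # absent; the practice is not performed

# Only FI and LI responses satisfy a practice for MIL achievement.
SATISFIES_MIL = {Response.FI, Response.LI}
```

Keeping the satisfying subset explicit makes the later scoring rule a one-line membership test.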
Selecting Responses
• Consider key phrases in the practices
• Examples of key phrases
• “periodically and according to defined triggers”
• “is established and maintained”
• Use the clickable pop-up definitions of key terms or the glossary to help
answer specific questions
• Use the help text to help answer specific questions about practices
• Responses should reflect the current state of the organization
• Consider documenting in-progress or planned projects in the notes field
• Consider updating your self-evaluation when in-progress or planned projects
are complete or at a set period, such as every six months
- 22 -
Asset Types Included in the C2M2
Asset: something of value to the organization
• IT asset: hardware and software
• OT asset: hardware and software
• Information asset: information essential
for operating the function
Consider these three types of asset when
C2M2 practices refer to “assets.”
Examples of Assets
• IT Assets: workstations; switches, routers, and firewalls; servers; virtual machines; software
• OT Assets: workstations; switches, routers, and firewalls; servers; virtual machines; software; programmable logic controllers; HMI stations; RTUs; actuators; sensors
• Information Assets: business data; intellectual property; customer information; contracts; security logs; metadata
- 23 -
Ad Hoc Practices
• By design, MIL1 practices may be implemented in an ad hoc manner and still be
considered Fully Implemented.
• Performance of ad hoc practices may depend on the initiative and experience
of an individual or team, without much organizational guidance (e.g., policy or
procedures)
• The quality of the outcome may vary significantly depending on who is
performing the practice and other factors such as the methods or tools used
• Lessons learned may not be captured and outcomes may be difficult to repeat
• Institutional knowledge is not captured in documented policy, procedures, or
work instructions
• The practice may not be retained over time or under times of stress
• Implementation of ad hoc practices, like any other practice, may result in
documented artifacts (e.g., asset inventories, a cybersecurity program strategy)
- 24 -
Practices with Dependencies
• There are several practices that, depending on how they are scored, inform
the logical response for certain other practices.
• Practices that contain dependencies reference the related practice in
parentheses within the text of the practice.
• For example, THREAT-2j is dependent upon SITUATION-3g. THREAT-2j states
“Threat monitoring and response activities leverage and trigger predefined
states of operation (SITUATION-3g).”
• If SITUATION-3g is scored as Not Implemented (i.e., the organization has not
identified predefined states of operation), then THREAT-2j should not be
scored as Largely Implemented or Fully Implemented.
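This dependency rule can be expressed as a small consistency check. The sketch below is illustrative: the practice IDs come from the example above, but the dependency dictionary, function name, and sample scores are hypothetical, not taken from the self-evaluation tool.

```python
# Map each dependent practice to its prerequisite practice.
DEPENDENCIES = {"THREAT-2j": "SITUATION-3g"}

def inconsistent_scores(scores: dict[str, str]) -> list[str]:
    """Return dependent practices scored LI/FI whose prerequisite is Not Implemented."""
    flagged = []
    for dependent, prereq in DEPENDENCIES.items():
        if scores.get(prereq) == "NI" and scores.get(dependent) in {"LI", "FI"}:
            flagged.append(dependent)
    return flagged

# Hypothetical example: SITUATION-3g is Not Implemented, so a Fully
# Implemented response for THREAT-2j is flagged for review.
example = {"SITUATION-3g": "NI", "THREAT-2j": "FI"}
```

A facilitator could run such a check after the workshop to surface responses worth revisiting before the report is generated.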
- 25 -
Practices with Compound Statements
• If some but not all activities in a practice are implemented, the response
should reflect the level of implementation (i.e., it should not be considered
Fully Implemented)
• For example, ARCHITECTURE-1b states “A strategy for cybersecurity
architecture is established and maintained in alignment with the
organization’s cybersecurity program strategy (PROGRAM-1b) and enterprise
architecture”
• If the strategy exists but is not routinely updated, Largely Implemented
should be considered instead of Fully Implemented
- 26 -
Practice Progressions
• The model includes sets of related practices that form increasingly complete,
comprehensive, or advanced implementations of an activity
• These are called practice progressions and are noted in the help text
• These practice progressions mostly start with ad hoc practices in MIL1 that
serve as a foundation upon which the other practices build
• When evaluating a practice in a progression, implementation of prior
practices in the progression should be considered
- 27 -
Practice Progression Example
ASSET-1a, ASSET-1b, ASSET-1f, and ASSET-1g form a progression.
These practices are about the IT and OT asset inventory being created and growing in
completeness and accuracy.
- 28 -
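As an illustration, a progression can be represented as an ordered list so that, when evaluating a later practice, the earlier practices to review are easy to look up. The practice IDs come from the example above; the helper function is hypothetical, not part of the model.

```python
# The ASSET inventory progression from the example above, in order.
PROGRESSION = ["ASSET-1a", "ASSET-1b", "ASSET-1f", "ASSET-1g"]

def earlier_practices(practice_id: str) -> list[str]:
    """Return the practices that precede practice_id in its progression."""
    index = PROGRESSION.index(practice_id)
    return PROGRESSION[:index]
```

When scoring ASSET-1f, for instance, `earlier_practices("ASSET-1f")` lists ASSET-1a and ASSET-1b as the foundation practices whose implementation should be considered first.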
Scoring and Interpretation
Sample MIL Achievement by Domain
• A summary bar chart provides a quick
indication of the MIL achieved in each
domain.
• In this example:
• MIL3 has been achieved in ACCESS.
• MIL2 has been achieved in
SITUATION, WORKFORCE, and
PROGRAM.
• MIL1 has been achieved in all other
domains.
- 30 -
Sample Summary Score MIL1
(ring chart with segments for MIL1, MIL2, and MIL3)
There are 3 practices at MIL1 for the SITUATION domain. The colors and numbers in the ring summarize the implementation level of those practices; in this case, 2 practices at MIL1 are Fully Implemented and 1 is Largely Implemented.
- 31 -
Sample Summary Score MIL2
(ring chart with segments for MIL1, MIL2, and MIL3)
There are 16 cumulative practices at MIL2: 3 at MIL1 and 13 at MIL2. Of these, 2 practices are Fully Implemented, 9 are Largely Implemented, and 5 are Partially Implemented.
- 32 -
Sample Summary Score
• This figure shows summarized results for every practice, grouped by domain
• To achieve a MIL in a domain, all practices in that MIL and in all preceding
MILs must have responses of either Fully Implemented or Largely
Implemented
- 33 -
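The cumulative achievement rule above can be sketched in a few lines of Python. This is an illustrative sketch, not the self-evaluation tool's actual scoring code, and the sample practice data is invented.

```python
def achieved_mil(practices: list[tuple[int, str]]) -> int:
    """practices: (mil, response) pairs for one domain.

    Returns the highest MIL achieved: a level counts only if every practice
    at that level and all preceding levels is FI or LI.
    """
    achieved = 0
    for level in (1, 2, 3):
        at_or_below = [resp for mil, resp in practices if mil <= level]
        if at_or_below and all(resp in {"FI", "LI"} for resp in at_or_below):
            achieved = level
        else:
            break  # MILs are cumulative, so a gap stops the progression
    return achieved

# Invented example: all MIL1 practices are FI/LI, but one MIL2 practice is
# only Partially Implemented, so the domain achieves MIL1.
domain = [(1, "FI"), (1, "LI"), (1, "FI"), (2, "FI"), (2, "PI")]
```

The early `break` encodes the cumulative property: a Partially or Not Implemented practice at any level caps the domain at the level below it.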
Summary of Management Activities
• Management Activities
focus on the extent to which
cybersecurity practices are
institutionalized, or
ingrained, in the
organization’s operations
• This chart offers an
overview from two
perspectives:
1) implementation within
each domain
2) implementation across
the 10 domains
- 34 -
Questions?
Thank You


C2M2 V2.1 Self-Evaluation Workshop Kickoff Presentation -- July 2023.pptx


Editor's Notes

  • #3: Placeholder discussion topics: sets the expectation for the workshop; roles of the participants.
  • #4: Placeholder discussion topics. Emphasize the importance of the effort and the accuracy of responses. Discuss how the results will be used within the organization’s overall cybersecurity program. Note that next steps will be based on the organization’s risks and maturity. Highlight that the output of the C2M2 self-evaluation should drive risk conversations and allow organizations to plan periodic reviews of their cybersecurity program to track progress and validate goals. Point out the roles of participants in follow-up activities.
  • #16: Placeholder for listing roles involved in the Self-Evaluation
  • #20: Select the screenshot of the chosen tool