6. C2M2 Version 2.1 Overview
The C2M2 is a free tool to help organizations
evaluate their cybersecurity capabilities and
optimize their security investments.
• Designed for any organization regardless of
ownership, structure, size, or industry
• Uses a set of 350+ industry-vetted
cybersecurity practices focused on both
information technology (IT) and operations
technology (OT) assets and environments
• Results help users prioritize cybersecurity
investment decisions based on their risk
• Developed in 2012 and maintained
through an extensive public-private
partnership between the U.S. Department
of Energy’s Office of Cybersecurity, Energy
Security, and Emergency Response and
numerous government, industry, and
academic organizations
• Recent updates in 2022 reflect new
technologies, threats, and practices
7. Key Features of the C2M2
Area | Description
Maturity Model | The C2M2 consists of cybersecurity practices that are organized into three progressive levels of cybersecurity maturity.
Management Activities | Management activities measure the extent to which cybersecurity is ingrained in an organization’s culture.
Specificity | The C2M2 is descriptive, not prescriptive. Practice statements focus on outcomes that may be implemented through any number of measures.
Scoping | The C2M2 may be applied to an entire enterprise or to individual parts of the enterprise to enable users to select an appropriate level of granularity.
Usability | A C2M2 self-evaluation can be completed in one day using a free tool that securely records results and generates a detailed, graphical report.
8. Model Organized by 10 Domains
• Domains are logical groupings of cybersecurity
practices
• Each domain has a short name for ease of
reference
• ASSET: Asset, Change, and Configuration Management
• THREAT: Threat and Vulnerability Management
• RISK: Risk Management
• ACCESS: Identity and Access Management
• SITUATION: Situational Awareness
• RESPONSE: Event and Incident Response, Continuity of Operations
• THIRD-PARTIES: Third-Party Risk Management
• WORKFORCE: Workforce Management
• ARCHITECTURE: Cybersecurity Architecture
• PROGRAM: Cybersecurity Program Management
9. Model Structure
• The model contains 10 domains
• Each domain contains multiple approach objectives, which are unique to that domain; approach objectives are supported by a progression of practices at MIL1, MIL2, and MIL3 that are also unique to the domain
• Each domain contains one management objective, which is similar in each domain; the management objective is supported by a progression of practices at MIL2 and MIL3 that are similar in each domain and describe institutionalization activities
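This hierarchy (model → domains → objectives → practices at MILs) can be pictured as a simple data structure. The sketch below is a minimal illustration only, not part of the C2M2 or its tools; the class names, fields, and the sample RISK entries are assumptions made for the example.

```python
from dataclasses import dataclass, field
from typing import Optional

# Illustrative sketch of the structure described above; all names are hypothetical.

@dataclass
class Practice:
    identifier: str   # e.g., "RISK-1a"
    mil: int          # maturity indicator level: 1, 2, or 3

@dataclass
class Objective:
    name: str
    practices: list[Practice] = field(default_factory=list)

@dataclass
class Domain:
    short_name: str
    # Approach objectives: several per domain, unique to the domain, practices at MIL1-MIL3
    approach_objectives: list[Objective] = field(default_factory=list)
    # Management objective: one per domain, similar across domains, practices at MIL2-MIL3
    management_objective: Optional[Objective] = None

# The model is the set of 10 domains (only one shown here).
model: list[Domain] = [
    Domain(
        short_name="RISK",
        approach_objectives=[
            Objective("Example approach objective", [Practice("RISK-1a", 1)]),
        ],
        management_objective=Objective("Management Activities", [Practice("RISK-5a", 2)]),
    ),
    # ... the remaining nine domains
]
```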
10. Maturity Indicator Level Descriptions
Level | Description
MIL0 | Practices are not performed
MIL1 | Initial practices are performed but may be ad hoc (performance depends largely on the initiative and experience of the individual or team)
MIL2 | Management characteristics: practices are documented; adequate resources are provided to support the process. Approach characteristic: practices are more complete or advanced than at MIL1
MIL3 | Management characteristics: activities are guided by policies (or other organizational directives); responsibility, accountability, and authority for performing the practices are assigned; personnel performing the practices have adequate skills and knowledge; the effectiveness of activities in the domain is evaluated and tracked. Approach characteristic: practices are more complete or advanced than at MIL2
11. Maturity Indicator Level Considerations
• Maturity indicator levels (MILs) apply independently to each domain
• MILs are cumulative within each domain
• To achieve MIL3 in a domain, the organization must perform the MIL1, MIL2,
and MIL3 practices in the domain
12. Model at a Glance
Figure: grid with the 10 model domains as columns and the three maturity indicator levels (MIL1, MIL2, MIL3) as rows; each cell represents the practices at that MIL for that domain (e.g., the MIL1 practices for the RISK domain).
• Three maturity indicator levels (MILs): defined progressions of practices
• 10 model domains: logical groupings of cybersecurity practices
13. C2M2 Architecture – ACCESS Domain Example
Figure: excerpt of the ACCESS domain showing the domain name, the domain objectives, and the practices organized by objective and MIL.
14. Using the Model
• Perform a Self-Evaluation: determine the
implementation of cybersecurity activities
within the organization
• Analyze Identified Gaps: determine whether
the gaps are meaningful and important and
should be addressed
• Prioritize and Plan: prioritize the actions
needed to fully implement the practices to
achieve the desired capability
• Implement Plan: implement the plans defined
in the previous step to address the identified
gaps
16. Self-Evaluation Roles
Number Role Names
1 Sponsor
2 Organizer
3 Facilitator
4 SMEs
5 Observers
6 Scribe
7 Support Staff
8 Participants
17. Defining the Scope of the Self-Evaluation
Identify the function in scope for the self-evaluation
• “Function” is the part of the organization to which the C2M2 is applied
• Consider the level of similarity of security management when selecting a function
• Consider the systems and organizations that share interdependencies with the
function
A self-evaluation for the in-scope function would include
• Assets important to the delivery of the function
• Assets within the function that may be leveraged to achieve a threat objective
• Other potential assets within the function (e.g., data in the cloud, mobile devices)
• Third parties that have access to, control of, or custody of the function’s assets
• Third parties upon which the function depends
18. Agreed-Upon Function and Scope
• Placeholder for the organization performing the self-evaluation to describe the identified function in scope for the self-evaluation
19. Self-Evaluation Process
• The facilitator will assist the group in reaching consensus on responses
• Select the response that best matches the level of implementation for each
practice
• If consensus cannot be reached on a practice, moving on and returning to it later may help the group reach agreement
20. Self-Evaluation Tool
• The self-evaluation tool will not
create a self-evaluation report unless
a response is selected for every
practice
• If the organization has chosen not to
implement a practice, select
Not Implemented
• Use the notes field to capture
important information (e.g., rationale
for responses) that may be of use
later in explaining self-evaluation
results
21. Response Options
Response | Description
Fully Implemented (FI) | Complete
Largely Implemented (LI) | Complete, but with a recognized opportunity for improvement
Partially Implemented (PI) | Incomplete; there are multiple opportunities for improvement
Not Implemented (NI) | Absent; the practice is not performed by the organization
22. Selecting Responses
• Consider key phrases in the practices
• Examples of key phrases
• “periodically and according to defined triggers”
• “is established and maintained”
• Use the clickable pop-up definitions of key terms or the glossary to help
answer specific questions
• Use the help text to help answer specific questions about practices
• Responses should reflect the current state of the organization
• Consider documenting in-progress or planned projects in the notes field
• Consider updating your self-evaluation when in-progress or planned projects
are complete or at a set period, such as every six months
23. Asset Types Included in the C2M2
Asset: something of value to the organization
• IT asset: hardware and software
• OT asset: hardware and software
• Information asset: information essential
for operating the function
Consider these three types of asset when
C2M2 practices refer to “assets.”
Examples of Assets
• IT Assets: workstations; switches, routers, and firewalls; servers; virtual machines; software
• OT Assets: workstations; switches, routers, and firewalls; servers; virtual machines; software; programmable logic controllers; HMI stations; RTUs; actuators; sensors
• Information Assets: business data; intellectual property; customer information; contracts; security logs; metadata
24. Ad Hoc Practices
• By design, MIL1 practices may be implemented in an ad hoc manner and still be
considered Fully Implemented.
• Performance of ad hoc practices may depend on the initiative and experience
of an individual or team, without much organizational guidance (e.g., policy or
procedures)
• The quality of the outcome may vary significantly depending on who is
performing the practice and other factors such as the methods or tools used
• Lessons learned may not be captured and outcomes may be difficult to repeat
• Institutional knowledge is not captured in documented policy, procedures, or
work instructions
• The practice may not be retained over time or during times of stress
• Implementation of ad hoc practices, like any other practice, may result in
documented artifacts (e.g., asset inventories, a cybersecurity program strategy)
25. Practices with Dependencies
• There are several practices that, depending on how they are scored, inform
the logical response for certain other practices.
• Practices that contain dependencies reference the related practice in
parentheses within the text of the practice.
• For example, THREAT-2j is dependent upon SITUATION-3g. THREAT-2j states
“Threat monitoring and response activities leverage and trigger predefined
states of operation (SITUATION-3g).”
• If SITUATION-3g is scored as Not Implemented (i.e., the organization has not
identified predefined states of operation), then THREAT-2j should not be
scored as Largely Implemented or Fully Implemented (see the sketch below)
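As a rough illustration of how such a dependency could be checked, here is a minimal, hypothetical sketch; the function name, the DEPENDENCIES table, and the response-code layout are assumptions for the example, not part of the actual C2M2 self-evaluation tool.

```python
# Hypothetical consistency check for practice dependencies (illustrative only).
# Responses map practice identifiers to one of: "FI", "LI", "PI", "NI".

DEPENDENCIES = {
    # dependent practice -> practice it references
    "THREAT-2j": "SITUATION-3g",
}

def dependency_warnings(responses: dict[str, str]) -> list[str]:
    """Flag responses where a dependent practice is scored LI or FI while the
    practice it relies on is Not Implemented."""
    warnings = []
    for dependent, prerequisite in DEPENDENCIES.items():
        if responses.get(prerequisite) == "NI" and responses.get(dependent) in ("LI", "FI"):
            warnings.append(
                f"{dependent} is scored {responses[dependent]}, "
                f"but {prerequisite} is Not Implemented"
            )
    return warnings

# Example: SITUATION-3g is Not Implemented, so THREAT-2j should not be LI or FI.
print(dependency_warnings({"SITUATION-3g": "NI", "THREAT-2j": "LI"}))
```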
26. Practices with Compound Statements
• If some but not all activities in a practice are implemented, the response
should reflect the level of implementation (i.e., it should not be considered
Fully Implemented)
• For example, ARCHITECTURE-1b states “A strategy for cybersecurity
architecture is established and maintained in alignment with the
organization’s cybersecurity program strategy (PROGRAM-1b) and enterprise
architecture”
• If the strategy exists but is not routinely updated, Largely Implemented
should be considered instead of Fully Implemented
27. Practice Progressions
• The model includes sets of related practices that form increasingly complete,
comprehensive, or advanced implementations of an activity
• These are called practice progressions and are noted in the help text
• These practice progressions mostly start with ad hoc practices in MIL1 that
serve as a foundation upon which the other practices build
• When evaluating a practice in a progression, implementation of prior
practices in the progression should be considered
28. Practice Progression Example
ASSET-1a, ASSET-1b, ASSET-1f, and ASSET-1g form a progression.
These practices concern creating the IT and OT asset inventory and increasing its completeness and accuracy.
30. Sample MIL Achievement by Domain
• A summary bar chart provides a quick
indication of the MIL achieved in each
domain.
• In this example:
• MIL3 has been achieved in ACCESS.
• MIL2 has been achieved in
SITUATION, WORKFORCE, and
PROGRAM.
• MIL1 has been achieved in all other
domains.
31. Sample Summary Score MIL1
Figure: summary score ring with segments for MIL1, MIL2, and MIL3.
• There are 3 practices at MIL1 for the SITUATION domain
• The colors and numbers in the ring summarize the implementation level of those practices; in this case, 2 practices at MIL1 are Fully Implemented and 1 is Largely Implemented
32. Sample Summary Score MIL2
Figure: summary score ring with segments for MIL1, MIL2, and MIL3.
• There are 16 cumulative practices at MIL2: 3 at MIL1 and 13 at MIL2
• 2 practices are Fully Implemented, 9 are Largely Implemented, and 5 are Partially Implemented
33. Sample Summary Score
• This figure shows summarized results for every practice, grouped by domain
• To achieve a MIL in a domain, all practices in that MIL and in all preceding
MILs must have responses of either Fully Implemented or Largely
Implemented
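As a concrete illustration of the rule in the bullet above, the following minimal sketch computes the MIL achieved in a single domain from its practice responses. It is a hypothetical example only (the data layout, response codes, and function name are assumptions), not the logic of the actual self-evaluation tool.

```python
# Minimal, hypothetical sketch of the MIL achievement rule described above.
# For one domain, practices is a list of (mil, response) pairs, where response
# is one of "FI", "LI", "PI", "NI".

def achieved_mil(practices: list[tuple[int, str]]) -> int:
    """Return the highest MIL achieved (0-3): a MIL is achieved only when every
    practice at that MIL and at all preceding MILs is FI or LI."""
    achieved = 0
    for level in (1, 2, 3):
        at_or_below = [resp for mil, resp in practices if mil <= level]
        if at_or_below and all(resp in ("FI", "LI") for resp in at_or_below):
            achieved = level
        else:
            break
    return achieved

# Example: all MIL1 practices are FI/LI, but one MIL2 practice is only Partially
# Implemented, so the domain achieves MIL1.
example = [(1, "FI"), (1, "LI"), (1, "FI"), (2, "PI"), (2, "FI"), (3, "LI")]
print(achieved_mil(example))  # prints 1
```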
34. Summary of Management Activities
• Management Activities
focus on the extent to which
cybersecurity practices are
institutionalized, or
ingrained, in the
organization’s operations
• This chart offers an
overview from two
perspectives:
1) implementation within
each domain
2) implementation across
the 10 domains
#3: Placeholder discussion topics
• Sets the expectation for the workshop
• Roles of the participants
#4: Placeholder discussion topics
• Emphasize the importance of the effort and the accuracy of responses
• Discuss how the results will be used within the organization’s overall cybersecurity program
• Note that next steps will be based on the organization’s risks and maturity
• Highlight that the output of the C2M2 self-evaluation should drive risk conversations and allow organizations to plan periodic reviews of their cybersecurity program to track progress and validate goals
• Point out the roles of participants in follow-up activities
#16: Placeholder for listing roles involved in the self-evaluation