CISO Scorecard Example
How do you answer the boss's "how secure are we?" question? Quantifying what really matters in cyber
security is the first step in deciding what to measure, and therefore in developing any sort of scorecard.
The current industry trend is toward a risk-based security strategy, applying resources where the impact
reduction is greatest. "What matters" then becomes the protection capabilities and vulnerability
mitigations that add the best value in reducing the organization's overall risk. These risk reduction
capabilities should be grounded in the organization's key business success factors and should support
threat vector minimization, vulnerability impact reduction, and enhanced competitive advantage.
Any "CISO scorecard" therefore represents the overall "best risk value" for the organization from those key
views. The 'why' of collecting security metrics also includes justifying security spending, supporting
compliance and regulations, minimizing data breach risk (loss of brand / reputation), etc.
Many downsides of ineffective risk management are hard to quantify (intangible effects, specific
damages potentially incurred, cost avoidance, etc.), yet in the aggregate they are very costly.
Ideally the organization has documented business success objectives that help determine the related
technical objectives by which the security controls are applied. If not, one must use the vision
and mission statements to distill a set of operational success objectives, then translate those
requirements into technical objectives. After that, a risk assessment is applied and security controls are
determined. In many cases this is the hardest step, as it involves two transformations followed by
mapping the major controls, often starting from fuzzy objectives. Nevertheless, the security
strategy must be risk-value based, so this is a valuable exercise for the security team to undertake to ensure
its major cyber tasks and resources are focused on the most effective risk value elements. In fact, the
ultimate security objective is to provide a competitive-advantage, differentiating cyber strategy, which
includes minimizing risks. The following sections describe how we distill the core risk factors and
major focus areas to develop the CISO scorecard; the initial notional views suggested below
help frame the bigger picture as we start the 'what matters' journey.
Figure 1.0 – Notional risk value elements
Next we apply these major risk reduction and competitive advantage elements to the company's
strategic plan (illustrated below), then distill the major business factors on which to base our CISO scorecard.
Figure 2.0 – Company strategic business factors (example - insert YOURS here!)
Using the risk reduction (and differentiating) measures quantified later in the document,
along with these strategic plan objectives, we propose the key operational business success objectives
and their corresponding technical objectives. The most likely risks and barriers to those
objectives are then assessed, and from that assessment the security measures making up the CISO
scorecard are selected. These measures must be relatively easy to capture and process, both to minimize
resource overhead and to support a clear picture of the 'best risk value' outcomes.
Scorecards are used for strategic decision support – especially for financials and operations. For business
decision support, scorecards are created to present targeted answers to common leadership questions:
Are we meeting our fiduciary and statutory requirements?
How do we compare to our peers? Are we advancing our objectives?
How are we identifying and managing risk? Are we improving our risk posture?
Are we investing in and advancing initiatives in the right order?
A Cyber Security Scorecard will help steer an organization towards the desired cyber security strategy,
while providing answers to both the leadership questions and others raised by executives:
Board of Directors Questions
1. What is the status of our cyber resilience capabilities compared to the threat level?
2. What is the impact that cyber security risks have on our business strategy, risk posture?
3. How do our measures and investments compare to the rest of our sector?
4. Are we compliant with the relevant cyber security and related regulations?
CIO Questions
1. What are the key drivers in cyber security risk management and how are they developing?
2. What is the status of our cyber security detection and prevention capabilities?
3. What is the status of the risk framework and processes, including compliance / audit / V&V?
4. What were the root causes and actions taken for any high-impact incidents in the last period?
Ultimately, the security scorecard's objective is to help CISOs be more successful at communicating the
business value of information security and at linking strategy with execution. A pseudo-formula for
how to do it: Strategy Map + Measures and Targets + A Set of Funded Initiatives = A Complete Program
of Action. Table 2.0 outlines the mapping between the company objectives and the most likely risks (as
listed in Table 1.0 and further categorized later), while the next section assesses the security
measures suggested by several authoritative sources (plus other operational metrics in the appendix) that
best capture and align to those risks. We note upfront that there is one overriding risk that
pervasively affects all objectives: a data breach. This should not surprise
anyone, given the record number of events in 2016 in terms of files, organizations and costs. Security developed
a separate data breach risk posture approach that assessed the top ten vulnerabilities and risks therein.
In Table 1.0 the major data breach causal factors are listed as 'methods,' since they contain the typical
mix of policy, process, people and product (technology) elements. To start the risk estimation journey,
especially for those who have only a general awareness of the security methods, we begin with a
high-level sense of the relative 'goodness' level, providing a heuristic view of the security 'posture.' The
'key areas / exposure' column (not shown below) provides a high-level view of the major residual
problem areas to help link the relationships and potential impacts across the other methods. The relative
posture level and numeric risk value estimate for each method can give a sense of the impact
reduction potential for minimizing that vulnerability's risk level, taking into account the defense-in-depth
status of the system architecture as a whole (the dependencies, hierarchy, inheritance, etc. at play).
(note - risk values are notional, examples used for illustrative purposes, NOT actuals, insert yours here!)
Table 1.0 – Main Data Breach methods and mitigations
Methods | Posture | Risk (1-25) | Mitigations (common objectives)
Risks depend on a threat's ability to take advantage of existing vulnerabilities. A risk-based
security strategy should therefore focus on the highest threats to the company, specifically the "threat actors"
that can exploit existing vulnerabilities and thus create the highest risks. These intruder threat
vectors also need to be minimized and accounted for in the metrics we capture, whether scorecard or
operational. The typical threat sources accounted for in the risk value process are:
(1) - Insiders (85+% of all attacks);
(2) - Privileged users (stolen credentials cause most data breaches);
(3) - Hackers (nation states, criminals, and other malevolent actors - typically using malware); and
(4) - Inattentive users (phishing attacks, poor social media behavior, weak passwords, etc.).
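As one way to make the notional posture and risk values in Table 1.0 concrete, the sketch below blends per-threat-source exposure with a method's posture score to land on the 1-25 scale. Every name, weight and score here is an illustrative assumption, not a value from this document.

```python
# Notional risk-value estimate per data-breach method (illustrative only).
# Posture: 1 (weak) .. 5 (strong); output targets the 1-25 range in Table 1.0.

THREAT_WEIGHTS = {            # assumed relative prevalence of threat sources
    "insider": 0.40,
    "privileged_user": 0.25,
    "hacker": 0.20,
    "inattentive_user": 0.15,
}

def risk_value(posture, exposure):
    """Scale exposure (1-5) by inverse posture (1-5) into a 1-25 risk score."""
    return (6 - posture) * exposure

def weighted_risk(posture, exposures):
    """Blend per-threat-source exposures (1-5) using the assumed weights."""
    return sum(THREAT_WEIGHTS[t] * risk_value(posture, e)
               for t, e in exposures.items())

# Example method: phishing defenses with weak posture (2) and mixed exposure.
score = weighted_risk(2, {"insider": 3, "privileged_user": 2,
                          "hacker": 4, "inattentive_user": 5})
print(round(score, 1))  # → 13.0
```

A stronger posture drives the score down: the same exposures at posture 5 yield a weighted risk of 3.25, which is the "impact reduction potential" the text describes.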
Table 2.0 aligns the operational objectives with the technical aspects therein and the corresponding key
risks and barriers.
Table 2.0 – Business Objectives and Risks Mapping
A. Grow market share
Technical aspects: Demonstrate safety, service, integrity & trust to show clear value; take an overall industry-ecosphere approach, versus just clients and customers; maintain visibility....
Key risks & overall barriers: A, C and F. Unclear common companywide objectives; no integrated capability roadmap.

B. Sector leader
Technical aspects: Enable innovation, establish standards, support the ISAC, support research and bold / disruptive changes for the industry - leapfrog legacy obsolescence....
Key risks & overall barriers: B, D and E. Limited R&D mapping to future clients' needs; limited standards efforts to set the industry baseline.

C. Drive efficiency
Technical aspects: Lean processes and overlap removal, key metrics and data-driven decisions, smart monitoring and audit / V&V....
Key risks & overall barriers: B, C and E. Limited companywide process improvement initiatives; no common "CM" or metrics processes or champion.

D. Company R&D drives capability transformation
Technical aspects: Special projects, data / predictive analytics, cyber safety, effective standards implementation, whole-company risk value view....
Key risks & overall barriers: B, C and F. Limited top-down ERM focus; unclear capability value chain.

E. Company divisions' synergies to improve product / services cohesion
Technical aspects: Common processes, user productivity focus, business-centric IT/security, new markets, technology-agnostic solutions....
Key risks & overall barriers: A, E and F. Unclear harmonization of overall company success factors; no 'commodity' view of core capabilities' key attributes.
The NIST process for developing security metrics is shown below as another process frame of reference.
Even at this high level, it is clearly an extensive process in itself, whereas our
development and execution resources are fairly constrained and we have no formal,
overarching corporate metrics processes to align with. We will account for these steps as we begin to
socialize and phase in the implementation of the recommended CISO scorecard metrics.
Figure 3.0 – NIST Metrics Life Cycle
Effective metrics should have five qualities. Most organizations know how to pick metrics that satisfy the
first four qualities: namely, (1) that they are expressed as numbers, (2) have one or more units of
measure, (3) are measured in a consistent and objective way, and (4) can be gathered efficiently
(cheaply). Yet only a few entities choose metrics that satisfy the most important criterion: contextual
relevance. That is, the metrics must help someone -- usually the boss -- make a decision about an
important risk or business issue. Too many organizations instead use metrics to erect byzantine processes
(excessively complicated, involving a great deal of administrative detail) that measure the minutiae of
day-to-day operations rather than what matters.
The CISO DRG key metrics (chapter 5 therein) are listed below, numbered for later reference rather than
priority; those in bold text are the initial metrics proposed (also to be phased in):
Administrative metrics:
1---Legal – (a) - Percentage of material contracts (defined as involving data sets that require privacy and
security controls) that have been evaluated by security (typically as a high level risk assessment).
(b) - Percentage of material contracts (defined as involving services with sensitive information that could
impact the core company operations) that have been evaluated by security, including requirements for
breach notification and confidentiality language.
2---Financial – (a) - Percentage of IT budget allocated to security (where this value varies widely
depending on industry and the cyber baseline effectiveness - a minimum of 5% is normally expected).
(b) - As applicable, status of major security projects for the year. (added)
3--- Human Resources (HR) – (a) - Percentage of employees who had a thorough background check
(including any previous questionable activities, whether illegal or unethical).
(b) - Percentage of job descriptions that highlight the employee's responsibility to protect the
organization's assets.
(c) - Percentage of employees who have taken the annual security awareness training (and ideally
passed an assessment of their retention of key cyber-safe tenets).
(d) - Percentage of employees who have read, acknowledged and been tested on the security policy.
4---Vendor Management – (a) Percentage of material vendors (defined as those that have the potential
for critical or severe risk, should a breach or service interruption occur) who have been audited either
directly or by a third party (e.g., SSAE 16 SOC 1 & 2 audits, etc.).
(b) - Percentage of material vendor relationships that are inventoried and documented.
(c) - Percentage of material vendors that participate in quarterly reviews with the security function.
Operational Metrics:
5---Software Inventory – (a) – Existence of an IT Asset Management database (“ITAM”) (added).
(b) - Percentage of known assets & ‘systems’ accounted for in “ITAM/CMDB.”
6---Information (data) Inventory – (a) Percentage of information assets accurately inventoried.
(b) Percentage of information classified accurately (assumes a data classification standard exists).
(c) – Percentage of systems documented (e.g., at least a data flow diagram exists, ideally a security plan).
7---Systems Upgrades & Patching – (a) Percentage of major systems (both hardware and software) that
are still supported by the manufacturer or a validated third party.
(b) – Percentage of systems that are patched within 30 days of critical security patches.
(c) – Percentage of systems that are scanned for vulnerabilities on a monthly basis.
8 – Multi-Factor Authentication (MFA) – (a) Percentage of systems with IP, PII, or other sensitive data
that use MFA.
(b) – Percentage of critical capabilities (servers, firewalls, etc.) & remote connections using MFA. (added)
9 – Mean time from incident detection to remediation – Time delay from detection to vulnerability fix.
10 – Aggregate threat level – composite score of the leading cyber threat intel (CTI) indicators. (added)
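Metric 10's composite could be computed along these lines; the indicator names, weights and bands below are assumptions for illustration, not a prescribed CTI feed or scale.

```python
# Aggregate threat level (metric 10): weighted blend of normalized CTI
# indicators, each scaled 0-100. Indicators and weights are illustrative.

CTI_WEIGHTS = {
    "phishing_volume": 0.30,
    "sector_targeting": 0.25,
    "exploit_activity": 0.25,
    "credential_dumps": 0.20,
}

def aggregate_threat_level(indicators):
    """Return a 0-100 composite; missing indicators default to 0."""
    return sum(w * indicators.get(name, 0)
               for name, w in CTI_WEIGHTS.items())

def threat_band(score):
    """Map the composite onto the bands leadership actually reads."""
    return ("LOW" if score < 25 else
            "GUARDED" if score < 50 else
            "ELEVATED" if score < 75 else "SEVERE")

level = aggregate_threat_level({"phishing_volume": 70, "sector_targeting": 40,
                                "exploit_activity": 55, "credential_dumps": 60})
print(level, threat_band(level))  # → 56.75 ELEVATED
```

Trending the band over time, rather than the raw number, keeps this metric readable at the scorecard level.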
Governance Metrics (compliance):
11 – Incident Response (IR) Plan – (a) Existence of a management-reviewed and approved IR plan.
(b) – Time since the IR plan was last tested (at least a tabletop, ideally a short breach exercise, yearly).
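Most of the DRG metrics above reduce to a coverage percentage over some inventory, so one generic helper covers many of them. A minimal sketch, using fabricated system records to compute metric 7(b):

```python
# Generic coverage percentage underlying most DRG metrics above
# (e.g., 3c % employees trained, 7b % systems patched within 30 days).
# The sample records are fabricated for illustration.

from datetime import date

def coverage_pct(items, predicate):
    """Percentage of items satisfying predicate, rounded to one decimal."""
    if not items:
        return 0.0
    return round(100 * sum(1 for i in items if predicate(i)) / len(items), 1)

systems = [
    {"name": "erp", "patch_released": date(2017, 3, 1), "patched": date(2017, 3, 20)},
    {"name": "crm", "patch_released": date(2017, 3, 1), "patched": date(2017, 4, 15)},
    {"name": "hr",  "patch_released": date(2017, 3, 1), "patched": date(2017, 3, 10)},
]

# Metric 7b: percentage of systems patched within 30 days of a critical patch.
pct_patched_30d = coverage_pct(
    systems, lambda s: (s["patched"] - s["patch_released"]).days <= 30)
print(pct_patched_30d)  # → 66.7
```

The same helper with a different predicate yields 3(c), 4(a), 6(c) and the rest, which keeps the collection overhead low, as the text requires.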
Next we list the CIS top 20 security controls metrics recommendations (Table 3.0 below; 28
measures overall) as one key reference, followed by the CIS recommendation for the top ten metrics to
use in a BSC. These CIS metrics will be harmonized with the CISO DRG metrics recommendations above
to provide our 'best of breed' key risk reduction and competitive advantage BSC metrics, which map
back to the major business objectives and can show a cost benefit where applicable.
Table 3.0 – CIS Security Controls Key Metrics
Incident Management – How well do we detect, accurately identify, handle, and recover from security incidents?
1. Cost of Incidents
2. Mean Cost of Incidents
3. Mean Incident Recovery Cost
4. Mean-Time to Incident Discovery
5. Number of Incidents
6. Mean-Time Between Security Incidents
7. Mean-Time to Incident Recovery

Vulnerability Management – How well do we manage the exposure of the organization to vulnerabilities by identifying and mitigating known vulnerabilities?
8. Vulnerability Scanning Coverage
9. Percent of Systems with No Known Severe Vulnerabilities
10. Mean-Time to Mitigate Vulnerabilities
11. Number of Known Vulnerabilities
12. Mean Cost to Mitigate Vulnerabilities

Patch Management – How well are we able to maintain the patch state of our systems?
13. Patch Policy Compliance
14. Patch Management Coverage
15. Mean-Time to Patch
16. Mean Cost to Patch

Configuration Management – What is the configuration state of the systems in the organization?
17. Percentage of Configuration Compliance
18. Configuration Management Coverage
19. Current Anti-Malware Compliance

Change Management – How do changes to system configurations affect the security of the organization?
20. Mean-Time to Complete Changes
21. Percent of Changes with Security Reviews
22. Percent of Changes with Security Exceptions

Application Security – Can we rely on the security model of business applications to operate as intended?
23. Number of Applications
24. Percent of Critical Applications
25. Risk Assessment Coverage
26. Security Testing Coverage

Financial Metrics – What is the level and purpose of spending on information security?
27. IT Security Spending as Percent of IT Budget
28. IT Security Budget Allocation
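Several of the CIS incident metrics (4, 5 and 7 in Table 3.0) fall straight out of a simple incident log; the log entries below are invented for illustration.

```python
# CIS incident metrics from a minimal incident log (fabricated data):
# CIS 4. Mean-Time to Incident Discovery, CIS 5. Number of Incidents,
# CIS 7. Mean-Time to Incident Recovery.

from datetime import datetime

incidents = [
    {"occurred": datetime(2017, 1, 3, 8), "detected": datetime(2017, 1, 5, 8),
     "recovered": datetime(2017, 1, 6, 8)},
    {"occurred": datetime(2017, 2, 10, 0), "detected": datetime(2017, 2, 10, 12),
     "recovered": datetime(2017, 2, 11, 0)},
]

def mean_hours(log, start, end):
    """Mean elapsed hours between two timestamps across all incidents."""
    spans = [(i[end] - i[start]).total_seconds() / 3600 for i in log]
    return sum(spans) / len(spans)

print(len(incidents))                                 # CIS 5: Number of Incidents
print(mean_hours(incidents, "occurred", "detected"))  # CIS 4: MTT Discovery, hours
print(mean_hours(incidents, "detected", "recovered")) # CIS 7: MTT Recovery, hours
```

Adding a cost field per incident would yield CIS metrics 1-3 from the same log with the same pattern.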
Discussion - So now what? We have listed the CISO DRG overall measures, the CIS top 28 and BSC top
10 measures, and the risk-based factors in the previous table that we proposed from the company's
business strategy. These measures all need to be harmonized into a 'best of breed' metric set and then
associated with the four BSC categories described earlier. How do we reconcile them, provide
performance indicators to senior leadership, and then decide which business outcome is weighted
more - risk reduction (cost avoidance) or innovation (increased revenue) - including a competitive
advantage aspect that adds to the overall business success objectives' value? In addition, the security
team has to be able to collect these metrics efficiently and transform them, where required, into the key
leadership-level interest items. Then we need to benchmark the key measures, or use some
other well-known company comparison method, to show relative goodness. These risk-focused areas
then serve as the foundation for identifying the most relevant measures to be considered on a
company's dashboard. They cover the core areas of cyber security: risks, compliance, incidents,
awareness & culture, threat level, maturity state, and key cyber security projects in development. (For
more examples see:)
https://assets.kpmg.com/content/dam/kpmg/pdf/2015/04/Cyber-Security-Dashboard1.pdf
http://www.securitymetrics.org/content/Wiki.jsp?page=Chapter10ScorecardDesign
So how do we define "cyber success" using the common BSC format in a way that also demonstrates the
security value sphere, which in turn supports the company's core business model, vision and mission? That is,
we align the CISO scorecard to show a triad of key contributions: (1) competitive advantage (facilitate
new business, enhance reputation, establish collaborative partnerships), (2) operational efficiency
(reduce security and compliance costs, increase productivity) and (3) risk reduction (cost avoidance,
minimize the number and impact of security events).
The earlier table mapped our business objectives to technical objectives and the main associated risks;
we use those to frame our selection of CISO scorecard metrics. In addition, the other day-to-day
security metrics we recommend (listed in the appendix), which have operational utility, also need to
show value in supporting the scorecard view. Security adopted the RBSS approach and uses it to
prioritize our tasks and resources; it follows that the same approach can be used to select the best-value
metrics for the targeted risks in the table. Given the numerous ways and frameworks to map
metrics to risks, we start with the CISO DRG metrics, integrate the CIS
metrics recommendations where needed into a core, best-of-breed measures set, and then map those
metrics to the targeted risks. This approach provides scorecard traceability to the key business
objectives and gives management 'data driven' decision support to obtain the best risk value
while also enhancing the company's competitive advantage.
Next we align the main risks with the corresponding metrics and the data they provide to assist in
decision support. The table provides a link to the risks, suggested metrics and decision support data.
Table 5.0 – Risks / Metrics / Decisions supported
A – Data protection
1.A – Material contracts assessed - Potential third-party privacy / data breach risks
2.A - % IT budget for security - Resource the critical protections for this risk
3.C - % employees annual training - Significant 'user threat' vector for this risk
4.A - % vendors / contractors assessed - Inherited security posture on their side
6.C - % systems documented (data flows) - Lack of documentation means more data risks
11 – Time since last IR plan test - Speed / accuracy of damage control is critical
14 - % projects with risk assessment (SP) - Lack of a security plan means unknown risks
16 - # cyber incidents / cost to remediate - Precursor to follow-on data breach costs

B – IAM
2.A - % IT budget for security - Resource the critical protections for this risk
3.A - % employees background check - Critical 'insider threat' propensity attribute
8.A - % sensitive-data systems using MFA - Stronger access controls for more data security
8.B - % critical IT systems using MFA - Improved protection of essential infrastructure

C – Vulnerability Mgmt
1.A – Material contracts assessed - Potential third-party privacy / data breach risks
2.A - % IT budget for security - Resource the critical protections for this risk
4.A - % vendors / contractors assessed - Inherited security posture on their side
5.B - % key assets / systems in ITAM/CMDB - Ability to prioritize, quantify and track risks
6.C - % systems documented (data flows) - Lack of documentation means more data risks
7.B - % systems patched within 30 days - Risk impact grows with time and exposure
7.C - % key systems with monthly scan - Verify & validate remaining vulnerabilities / risk
9 – Mean time from detection to remediation - Risk impact grows with more time exposure
10 – Aggregate threat level (and trends) - Focus / tune tools for increased threat actors
13.B – BCP/DR - time since last test - Resiliency risks increase with time exposure
14 - % projects with risk assessment (SP) - Lack of a security plan means unknown risks
16 - # cyber incidents / cost to remediate - Shows where vulnerability management is weak
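Keeping the Table 5.0 mapping in a machine-checkable form helps ensure scorecard changes never leave a risk theme uncovered. The abbreviated mapping below follows the DRG metric numbering; the helper names are ours:

```python
# Risk-theme to metric traceability (abbreviated from Table 5.0) with a
# simple coverage check; metric IDs follow the DRG numbering above.

RISK_METRICS = {
    "A - Data protection":    ["1.A", "2.A", "3.C", "4.A", "6.C", "11", "14", "16"],
    "B - IAM":                ["2.A", "3.A", "8.A", "8.B"],
    "C - Vulnerability Mgmt": ["1.A", "2.A", "4.A", "5.B", "6.C", "7.B",
                               "7.C", "9", "10", "13.B", "14", "16"],
}

def uncovered(mapping):
    """Risk themes with no associated metric (should be empty)."""
    return [theme for theme, metrics in mapping.items() if not metrics]

def shared_metrics(mapping):
    """Metrics serving more than one risk theme: best 'risk value' candidates."""
    counts = {}
    for metrics in mapping.values():
        for m in metrics:
            counts[m] = counts.get(m, 0) + 1
    return sorted(m for m, c in counts.items() if c > 1)

print(uncovered(RISK_METRICS))       # → []
print(shared_metrics(RISK_METRICS))  # metrics covering multiple risk themes
```

Metrics that appear under several themes (e.g., 2.A here) are natural candidates for the short scorecard, since each collected number reduces more than one risk's uncertainty.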
Any CISO scorecard effort also needs to link its metrics to the security strategy, and ideally do
so in one slide. We offer proposed IT security strategy statements below, which should capture
the security value triad intent, along with an example strategy slide. Once the BSC-based measures
that clarify the security value triad are finalized, update this slide; this proposed draft serves to help
frame the final scorecard recommendations. (Again, an example, notional strategy - put yours here!)
Process note - Developing a CISO scorecard (or any scorecard, for that matter) requires considerable effort to
reach the final formalized and approved metrics and format (for example, see appendix item "D", the SANS
scorecard checklist, with over two dozen recommended steps). Our scorecard effort starts with this
research / community paper to set the background and provide key references with potential
recommended security metrics, yielding an initial "strawman / notional" output to facilitate
the steps and discussions required in the checklist. It is rather like a rapid prototyping effort: the
general intent is known, along with the relevant best practices, so a baseline is developed from those key
sources, after which it is generally easier to verify requirements. This matters especially because many in
management do not have a full grasp of what security does, or of the high potential damage a data
breach can cause; showing a sample end-state scorecard can help facilitate
their understanding of, and concurrence on, the utility and value of the whole effort.
The operative question then becomes: after all the options, mapping and assessment so far, which
metrics do we start with, and in what format? Given that we do not yet have an approved formal metrics
program, and that we have a long road to complete the checklist steps and educate
management, a simple and short format is best. The risk vernacular (results based on impact to business
success objectives) is well known to all of management, so a natural starting format for our CISO Scorecard
is the risk theme. In this case we start with the top four risks (less the policy and
management ones) to tell the current cyber story, using an example 'risk heat map' illustration, followed
by our proposed CISO scorecard. This two-slide communication method both sets the stage on what
matters to the organization and provides a snapshot of the metrics that move the risks to lower levels.
We then translate the table into a C-suite / BoD level set of graphics, as 'presentation' counts!
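Before any graphics work, the 'risk heat map' slide can be prototyped as a plain likelihood-by-impact grid; the four risk placements below are invented, not assessed values.

```python
# Plain-text 5x5 risk heat map (likelihood x impact), letters keyed to the
# risk themes discussed above; the placements here are illustrative only.

def heat_map(risks, size=5):
    """risks: {label: (likelihood 1-size, impact 1-size)} -> printable grid."""
    grid = [["." for _ in range(size)] for _ in range(size)]
    for label, (likelihood, impact) in risks.items():
        grid[size - likelihood][impact - 1] = label  # high likelihood on top
    lines = ["impact ->"]
    for row in grid:
        lines.append(" ".join(row))
    return "\n".join(lines)

print(heat_map({"A": (4, 5), "B": (3, 3), "C": (5, 4), "D": (2, 2)}))
```

Risks landing in the top-right corner (high likelihood, high impact) are the ones the scorecard metrics should visibly move down and left over successive reporting periods.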
A---NIST Special Publication (SP) 800-55, Performance Measurement Guide for Information Security
http://csrc.nist.gov/publications/nistpubs/800-55-Rev1/SP800-55-rev1.pdf
1 – Security budget
2 – Vulnerability management
3 – Access control
4 – Awareness and training
5 – Audit and accountability
6 – Certification and accreditation
7 - Configuration management
8 – Contingency planning
9 – Identification and authentication
10 – Incident response
11 – Maintenance
12 – Media protection
13 – Physical environment
14 – Planning
15 – Personnel security
16 – Risk assessment
17 – Systems and services acquisition
18 – Systems and communications protection
19 – System and information integrity
(see also NIST 800-100, Section 7.0 (summarizes 800-55))
B---CIS top 28 metrics suggestions (based on / mapped to their CIS top 20 controls; listed in the main body):
https://benchmarks.cisecurity.org/tools2/metrics/CIS_Security_Metrics-Quick_Start_Guide_v1.0.0.pdf
The CIS Measurement Companion to the CIS Critical Security Controls provides specific information
sets and suggested thresholds for measuring over 90 of the CSC sub-controls; these will be used to execute
and implement the final recommended metrics set that supports our BSC view.
https://www.cisecurity.org/critical-controls/
In addition, SANS has a CIS 20 CSC based paper on metrics for each of the controls, to assist in execution:
https://www.sans.edu/student-files/projects/jwp-caincouture-whitepaper.doc
All security metrics must support key organizational goals and objectives, as listed earlier; we therefore
work backwards to identify the specific data that can be captured easily at some level and then used to
support decision making, resource allocation, etc. For "COMPANY" we propose the following top-down
measurements, which must also support a balanced scorecard (BSC) view. We start with three major
areas to report on: (1) key residual risks and impacts, (2) business value supported and (3) the overall
security state. These can then be further defined with suggested supporting data elements.
(1). Top technical security risks, which must then be put into a business impact view: (a) weak and
inconsistently enforced IdAM (CIS #5, 9, 14, 15 & 16), (b) minimal ITAM / CMDB process and ad hoc
release management (CIS #1, 2, 3 & 11), (c) intruders & malware (CIS #6, 12, 19), (d) unclear privacy
and data protection methods (CIS #10 & 13), and (e) weakly correlated security products (CIS
#20), which minimize the ability to quickly spot an intruder. These must all be quantified in terms of
business impacts.
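One common way to express these technical risks as business impact is annualized loss expectancy (ALE = single-loss expectancy x annual rate of occurrence). The dollar figures and rates below are placeholders, not estimates for any real company:

```python
# Annualized loss expectancy (ALE = SLE * ARO) as one way to state the
# technical risks in business-impact terms. All figures are placeholders.

def ale(sle_dollars, aro_per_year):
    """Single-loss expectancy x annual rate of occurrence."""
    return sle_dollars * aro_per_year

risks = {  # (SLE $, expected occurrences/year) - invented values
    "weak IdAM":           (250_000, 0.5),
    "ad hoc ITAM/CMDB":    (100_000, 1.0),
    "intruders & malware": (500_000, 0.3),
}

# Rank by annualized exposure, highest first, for the business-impact view.
for name, (sle, aro) in sorted(risks.items(), key=lambda kv: -ale(*kv[1])):
    print(f"{name}: ${ale(sle, aro):,.0f}/yr")
```

Comparing each risk's ALE against the cost of its mitigating control gives the cost-avoidance argument the scorecard's risk reduction leg relies on.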
(2). Business value as measured against the company's key success factors (… starting with the company
objectives listed earlier, then list / map them…). Translate the key security priorities into support for
those factors, as well as enhanced overall productivity and added competitive advantage.
(3) Overall state of security. This high-level cyber metric can become overly complex if not confined to
'security data that matters' (versus, e.g., the voluminous IT/security logs kept) and founded on 'outcome'
based metrics. Areas to include at some level are: cyber vulnerabilities, threat reports, security project
status, and process compliance (and maturity), again mapped into a BSC view showing risk value and
competitive advantage.
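A minimal sketch of how area (3)'s 'overall state of security' might roll up into a single leadership-level number, with assumed area weights and scores:

```python
# Overall security state (area 3): weighted roll-up of the outcome areas
# named above into one 0-100 score. Weights and scores are illustrative.

AREA_WEIGHTS = {
    "vulnerabilities": 0.30,  # scan coverage and time-to-patch outcomes
    "threat_level":    0.20,  # inverse of the aggregate threat metric
    "projects":        0.20,  # security project status
    "compliance":      0.30,  # process compliance and maturity
}

def overall_state(scores):
    """Weighted mean of per-area scores (each 0-100)."""
    return sum(AREA_WEIGHTS[a] * scores[a] for a in AREA_WEIGHTS)

score = overall_state({"vulnerabilities": 60, "threat_level": 45,
                       "projects": 80, "compliance": 70})
print(round(score))  # single number for the BSC view → 64
```

The weights themselves are a leadership decision (the risk reduction versus innovation trade-off discussed earlier), so agreeing on them is part of socializing the scorecard.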