
MIT Center for International Studies | Public-Private Partnership and Cybersecurity for Critical Infrastructure

SEAN ATKINS: All right. Good afternoon and welcome to the Center for International Studies' panel discussion on Defining Success and Mapping the Road Ahead for Public-Private Partnership and Critical Infrastructure Cybersecurity. I'm Sean Atkins, the host for the event and a candidate here at MIT's Political Science Department, where I co-lead a research project on this topic with Professor Chappell Lawson.

So if you've taken the time to register and join us today, it's probably not news to

you that critical infrastructure cybersecurity is a topic that gets a lot of attention.

And while some of this attention borders on-- or maybe sometimes crosses the line
into-- the realm of alarmism, there is real reason to take critical infrastructure

cybersecurity seriously.

Among a growing number of incidents, the US has already seen foreign cyber
operators conduct significant disruptive attacks on its financial services sector and
virtually place themselves at the controls of some electricity distribution points. Last

year's National Intelligence Threat Assessment noted that some cyber actors were
postured to disrupt US natural gas pipelines for a period of weeks, and others were

mapping out a number of other critical systems in order to be able to cause


substantial damage. The trend line, in terms of both scope and scale of threats,

suggests that this really isn't going to get any easier for us from here.

And to address this increasing risk, the US has largely relied on a public-private

partnership approach since at least the late 1990s. And this makes sense,

considering that the vast majority of this infrastructure, from financial services to

pipelines and from the power grid to telecom networks, is owned and operated by
private firms. Government-industry partnership is not just a good idea-- it's

essential.

But in practice, the public-private partnership approach has evolved significantly

over the last 20-plus years, often in response to the continuing flow of emerging

threats, realized vulnerabilities, and changes in technology that affect both of


these. The result has been a patchwork of policies across and within critical

infrastructure sectors. And something that becomes evident when you take a step
back to look at this evolution is that we haven't really established a clear definition

of what "good" looks like. We haven't defined what success is in this approach or
how we might tell that we are getting closer to it or missing it altogether.

And to be clear, there have certainly been significant successes as well as


instructive failures along the way that indicate where things are working and where

they are not. These offer insight that might be leveraged to sketch out a usable

definition of success that can guide policy-- something that gets us beyond broadly
conceived ideas of information sharing and cooperation between firms and with the

government. So this is a critical conversation to have and one that has been long in

the making.

It is also a conversation that I think today's panel is uniquely equipped to contribute

to. Many of you probably recognize some of the panelists. And if you take a look at
their bios, which will be posted in chat here, you can see that they bring to the table

deep experience on this issue in both industry and government, as well as in


practice and policy. I'd also like to highlight that they have individually worked on

different sides of the fences for this issue, personally bridging across the different

worlds that each have an important part to play in addressing what is truly a

multidimensional problem.

So from here, Professor Chappell Lawson is going to set up and then kick off our
panel discussion. And then following that conversation, which should take about 45

minutes or so, we'll open it up for questions. Thanks again for joining us. I hope you

enjoy and find the discussion useful. And with that, I'll hand it off to Professor

Lawson. Chap, it's all yours.

CHAPPELL LAWSON: Thanks so much, Sean. And I hope you all won't mind if I share my screen here to give you a sense of what we're going to focus on today once I stop talking and turn
it over to our panel. As Sean mentioned, the focus is really on cyber threats to

continuity of operations in critical infrastructure. Cybersecurity is obviously an

enormously capacious term, and many people at MIT and elsewhere have lots of

opinions on it from different angles. But this is our focus today.

And as Sean suggested, we're going to address three questions. The first is, what is
success? Or what are we trying to get to in this realm? The second is, how are we
doing? That is, now that we know what success is, how would we rate our efforts to

date? And then, what do we need to improve? That's the third question.

I should just say why we don't dive straight into question three: as with many conversations, it's just nice to know where you want to go before you start changing course and making
recommendations. So the proximate motivation for this seminar, from MIT's

perspective, was that Sean and I've been doing a lot of research, interviewing

people who are senior executives in the private sector or senior government

officials or former government officials on all of these questions.

And one of the most arresting things that came out of those conversations was how
stumped people seemed to be at first when we asked them, what is success? And

they came up with some of the answers that you see there on the screen. And we

gave our interviewees monikers. So "Jerusalem 2019" was somebody we

interviewed who was on the Hill and gave us the impression that this was a difficult

question and in many ways, much more problematic and vexing than similar

questions might be in other domains of public policy.

For instance, it's not the kind of response one gets if you ask about environmental

regulation or labor law protections or something like that. And even getting to

something as anodyne as the notion of continuous improvement in relationships

between the public and private sectors on cybersecurity took a little bit of prodding.

So what did we find after we did that prodding among the people who we

interviewed? And I'm not going to say that we necessarily agree with any of these
conclusions, but this was what jumped out at us from all of these conversations.

The first was kind of an acknowledgment that this is an enormously dynamic

environment with a lot of interdependencies, both informational and economic. And

so continuous improvement would be necessary for us to think we'd ever reached


anything like success in cybersecurity for critical infrastructure. And maybe also,

from some people, the notion that success was an indication that the sector was
becoming less likely to fail catastrophically as a result of a cyber attack.

But that wasn't all that we heard from the interviewees. We also heard sort of three
sub-definitions of success, I guess. One had to do with firms' capabilities. One with

collaboration between industry and government. And the third-- perhaps the most
important-- collaboration among different firms within each industry, including firms

who might normally be market competitors.

And I won't go through every element of these sub-definitions of success, but I will
just highlight one thing, which is the crucial importance that people mentioned of
trust-- trust between private firms and trust between private firms and the

government for building anything like a successful policy regime. So with that in
mind, by way of background only, let me just turn it over to our panel with that first

question.

What is the definition of success? If you were talking to the next head of CISA--

the Cybersecurity and Infrastructure Security Agency at the Department of Homeland


Security-- or somebody in a roughly analogous position in the White House, what

would you tell them they should be trying to get to in this domain? And I guess, let
me start out with Mark, if I may.

MARK MONTGOMERY: Thanks very much. Appreciate it. And appreciate the opportunity to be here. Look, I'll speak principally from a federal government point of view, being that the

Cyberspace Solarium Commission just did a lot of work on this. And we kind of
settled on the idea that the first thing that we had to do, no matter what our final

strategic approach was to critical infrastructure and cybersecurity, was that we had
to get the government organized properly-- because it hadn't been.

But if I was speaking to the CISA director, I'd give him or her three thoughts. First is
get the interagency performance and cooperation consistent and correct. The

interagency's inconsistent in how they support it. There's some agencies like
Department of Energy that really support the electrical power production and

distribution infrastructure well. Financial services are treated well.

But our water industry and its relationship with the EPA is not nearly as high-

functioning. And the budgets kind of support that. So this problem extends to the
Congress as well. So first is get the interagency consistent, consistently evaluating

risk across their sector. And then principally due to the inconsistency, but also a
little bit due to turf wars. So getting interagency performance correct.

The second of the three things would be start doing pre-planning to deal with
incidents and responses. You have to have done the pre-planning well in advance of
a significant cyber event or critical infrastructure impacting event. You have to have
built the playbooks and processes, and you have to have done this with the private

sector and the state and local governments. And then you have to exercise it at a
national level, kind of tabletop exercises. So start your planning.

And if it sounds like I just described the thing that should have happened before
COVID-- we did a white paper on just the analogy between COVID and cyber, and

they're both really nontraditional national security emergencies. And the failures
that most of us have noted at the front end of the COVID response will be replicated

in cyber if we're not careful.

And the third thing is you'll build resilience into your public-private collaboration. So

if you have that, you'll have success. And what that means is you've got to have a
vehicle for how you react, respond, and recover from an incident. And again, that's

what we call continuity planning. We'll talk about that in a future question. But I'll
tell you, get the interagency right. Start your pre-planning. Build resilience in your

public-private collaboration. To me, that's success from the federal government's


point of view.

CHAPPELL LAWSON: That's terrific. And Mark, do you just want to add anything about the structure of the Cyber Solarium and the report? Maybe we could send a link to the report out in the chat as well.

MARK MONTGOMERY: The commission was set up in the FY '19 National Defense Authorization Act by then-Chairman Senator John McCain, who was my boss. I worked for him on the Armed Services Committee at the time. He didn't like commissions, but he used commissions when he had a wicked problem that the Executive and Legislative branches couldn't solve. So when he authorized this commission, he gave us two

things. First he said, you got one year from when you start to when you're done. And
we took 10 months to finish our report. And then since then, we've been extended
but just to do legislative work.

The second thing he did was give us four legislators-- two senators, two

congressmen: Senator Sasse and Senator King, Representative Langevin,


Representative Gallagher. And having active legislators means it was fairly-- I

wouldn't say it was easy, but it was manageable for me to get a lot of our
recommendations into law. So we came up with a strategic approach to defend our
national infrastructure, democratic institutions, defense innovation base against a

significant cyber attack.

And McCain also said, hey, don't just give me a strategy. I want legislative and policy
remedies, heavily biased to action, which means legislation. So we came up with 82

recommendations-- 52 were legislative in nature, 30 for the Executive Branch alone.


The 52 legislative ones-- about 15 are for a future year, when a little more work's
been done on them. I tried to run 36 into the National Defense Authorization Act. 29

made it into the final conference and 25 are in this final bill. So we actually got 25
laws.

I would describe this National Defense Authorization Act as the most comprehensive
cybersecurity legislation this country's ever passed. Our CSC stuff's just a plurality.

It's not even a majority. There's about 79 cybersecurity-related provisions.

Some of the really big ones are ours-- like National Cyber Director, which I know
we'll talk about in a bit. But the commission set up by the NDAA then used the NDAA

to get things done. We'll do it again next year, and then we'll expire completely. But
the whole idea is to get legislation into action and be biased that way.

CHAPPELL LAWSON: That's terrific. Joel, let me see if you want to weigh in on the first question or anything else that Mark said.

JOEL BRENNER: Yeah. I want to begin by saying that the Cyber Solarium report is an extraordinary
document that everybody who's interested in this area should study closely. We've
never had anything like it. And the chairmen, Senator King and Congressman

Gallagher-- and Mark, having led this study-- really, everyone, the Republic is in your
debt. Of course, the hard work now is going to be to make it effective, to put it in
place.

But as for what we should be aiming at, I want to inject a different note. Notice how
procedurally-oriented all of the suggestions so far have been. On your slide, too,
Chap. Measuring oneself against how one used to be is a recipe for deception. If it

were good enough, General Motors would never have gone bankrupt because as
measured against themselves, they were continually getting better. But as
measured against Toyota and Volkswagen, they were continually getting worse.
So I think we need to really think about the state of affairs we want to be at. And I
believe there are five elements to that state of affairs and that this question
shouldn't be as hard as many people think it is. First, an attack on our critical

infrastructure would fail-- either in the sense that it wouldn't get through, or that it
could be dealt with very quickly and without significant consequence-- and would be
punished.

Second element. IP theft would also bring punishing consequences in terms of


trade, which would require that the TRIPS Agreement under the World Trade Organization-- which is a pre-World Wide Web agreement and terribly obsolete-- be significantly amended and brought up to date to make


that happen. As of now, every country needs to have rules against IP theft in its own
country, but stealing IP from another country is not a violation of the World Trade
Agreement. This is insane.

Third, cyber crime would be held within tolerable limits. I don't say "eliminate it" any
more than we can eliminate automobile theft. But in tolerable limits. And that would

require at least two major things, one of which the Solarium report deals with. And
that is that bots will be readily identified and taken down. We can do that now, but
we're not doing it. The second element would be much better cooperation with the
CIS states-- that's to say, Russia and its close allies. I see no prospect of that in the
near future.

Fourth element-- liability for defective goods. Now that could be defined in a lot of

different ways, but the Solarium Report also suggests that. I won't go into the details
of that now. But liability is an important driver of behavior in a capitalist economy,
in a market economy.

And right now, this is the only area that I can think of in which introducing knowingly
defective goods into the stream of commerce has no liability consequences. It's not
a government regulation problem. It's just a general liability issue. It's very strange

that it's the way it is.

Fifth, we would have effective standards-- partially through regulation, partially

through government suasion over areas of critical infrastructure. Now I notice that
the Solarium Report introduces a new term in regard to critical infrastructure, called
"systematically critical infrastructure." Mark may want to comment on this. I think
it's an acknowledgment that our current definition of critical infrastructure, which
includes, I think, 17 sectors, has become so broad as to be nearly meaningless. And

this is an attempt to sort of tighten that down a little bit.

But that's what success would look like. If we had those things, gee! We'd be out of
work. That would be great. These are not procedural definitions. These are actual
aspects of a new world that we'd all like to live in. That's my answer.

CHAPPELL LAWSON: That's great. And I should say before I turn it over to Larry, it's boring if we all agree. So if you have strong objections, I hope you'll weigh in.

JOEL BRENNER: I don't think that will be a problem-- not with the group you have here.

LARRY CLINTON: So, is that a lead-in for me, Chappy?

CHAPPELL LAWSON: That was.

LARRY CLINTON: And now for something completely different. [LAUGHING] So my remarks emanate from the premise that we are getting killed in cybersecurity. ISA has started a

program called "Rethink Cybersecurity." We think we need major change. We need


major structural change. We need major substantive change. We need major
attitudinal change. I'm going to get to some of those things later on.

But I want to start by going to Chappy's question, which is, what would success look
like? So in cybersecurity, we begin with risk assessment. So I think that success
would look like us finally developing a cybersecurity strategy that is in some rough

way equivalent to that of our adversaries. And I'm thinking specifically here of
China.

We need to develop a cybersecurity approach that is roughly equivalent in


thoughtfulness, in comprehensiveness, in integrated nature, and in support to what our
friends in China have developed. They have developed, over a period of years, a
sophisticated digital strategy. The cyber strategy is woven into that. They are

thinking of this in much broader terms.


One of the problems we think we have with this is we are making no progress

because we are thinking of the entire thing far too narrowly. Our Chinese colleagues
have not done that. The Digital Silk Road, which is a $1 trillion program, is combined with the Belt and Road Initiative and is designed, frankly, to fundamentally reorder the post-war, Western-based system that we've been operating under. And they are

having some real success, which I'll cover shortly.

When I look at Dick Clarke's book, The Fifth Domain, he makes a very interesting

comment, wherein he says that we haven't fundamentally changed our


cybersecurity strategy since the Clinton administration. Bill Clinton left office 20
years ago. Things have changed pretty substantially in the last 20 years. But I would
agree with Clarke and Knake-- we have not really changed our policy.

Basically, we are doing the same basic things. We're doing standards development
and information sharing, and frankly, not a lot more. We need to be working like our

Chinese colleagues have. They analyzed the digital world. They figured out how first
to leverage the vulnerabilities of the digital world to steal tons of intellectual
property, and leapfrog into competitiveness and even superiority with respect to
digital technology, largely by supporting their IT infrastructure much more than we
have.

So that's what success would look like. Success would look like our being as

sophisticated as our adversaries are being. And then we might be able to compete
in this space. Now aside from that-- and I'll get to structural change later on. And we
have some sympathy-- not sympathy, we have support-- for our friends at the
Solarium Commission, but we would go much further.

We think it is much, much too narrow an approach. But there are two things that I
would say before we get to kind of those policy and structural things. We need

attitudinal reform. We need an attitude adjustment. The public-private partnership


has largely been rhetorical in nature. It doesn't look really much different than the
other relationships that government has with industry. We need a real public-
private partnership.

The private sector should not be mere stakeholders to the government, but that's how the government really thinks of us. We need to be partners. If anything, the partnership
looks much more like a child-parent relationship, where most in the government
sector think of the private sector as unruly children who need to be disciplined and
directed. That's not the case.

This is not Enron. This is not WorldCom. The problem we have with cybersecurity is
not really corporate misfeasance or malfeasance. There may be-- there probably
are some uncaring, lazy, greedy people in the private sector. There are probably

some of them in the government, too.

But that's not the problem. The problem we have with cybersecurity is that we have

an inherently vulnerable system protecting incredibly valuable data. So long as that


remains the problem, we're going to continue to have these attacks. We need to
understand that the bad guys-- the criminals, the Chinese, the Iranians, et cetera, et
cetera-- they are stealing consumer private information, corporate intellectual

property, national secrets.

We're all on the same side. We're doing far too much finger-pointing. Government

points at industry. Industry points at government. The vendors point at the users.
The users point at vendors. Media points at everybody. We need to understand we
are in this together, and we need a much, much more fulsome, equitable structure
and partnership to run our digital strategy together.

It's got to look much more like a good marriage, not a parent-child relationship. We
need to understand each other's unique differences. We need to pull ourselves

together and work as a true unity against this massive threat that we are seeing--
not just to our critical infrastructure, to our economic way of life, and the economic
way of life of the Western world. I can detail that as we go on later.

Second thing that we really fundamentally need is we need to understand that we


are thinking about this still as though this is a technology problem. It's not a
technology problem. There's a technical component to it, obviously. But this is not a

technology problem. The problem isn't that the technology is bad. The problem is
that the technology is under attack. That's a fundamentally different problem.

The reason it's under attack is because all the economic incentives favor the
attackers. It's not a vulnerability issue. All of our infrastructure is vulnerable,
incredibly vulnerable. Our transportation system's vulnerable. Our water system's
vulnerable. Our agricultural systems are vulnerable. Why do we never read about

these guys being attacked? Because there's no money there.

Cue John Dillinger. Why do you attack the internet? Because that's where the money

is-- trillions worth of money. And we are still looking at this as though it's a
technology problem.

Now by the way, when I say "we," I'm thinking mostly of our government colleagues.
In the private sector, virtually everybody in the private sector has done what our
government has not done. And credit to the Solarium Commission because they
move us a little bit in that direction. Digital transformation. We have not gone

through the sort of digital transformation-- as, by the way, the Chinese have-- that
we need to do in this government.

We need to understand that this is as much an economic incentive problem as it is a


technology problem. We are not going to make the internet invulnerable. We need
to work together to come up with economic solutions to this. And by the way, now
some people are going to say, oh, the Chinese could do this because they have a

totalitarian government. And that's, of course, correct. OK, our system is not going
to match their system in terms of values.

But we have a history of developing public-private partnerships that work


effectively without going all the way to an industrial policy. Back after the Great
Depression, this is what we did with the New Deal. This is what we did in the
1960s when we developed NASA. That is what we did in the '80s when we came up

with the SEMATECH model.

We can develop much more fulsome, much more economically sensitive

government-industry partnerships. And unless we do that, we are not going to


achieve this. This is not going to be achieved by having better standards come out
of NIST or CISA. They're an important part of the problem, but this needs to be
approached in a much more fulsome, much more systematic, much more systemic

approach. And we have not done that yet. We have some ideas as to exactly how to
do that, which we'll get to later on. But thanks, Chap.

CHAPPELL LAWSON: Oh, that's great. And I think since you mentioned NIST, I got to turn it over to Tony.
TONY SAGER: And I have to catch my breath after Larry's passionate defense of his ideas. But
yeah, a lot of great stuff there. And number one-- kudos to you and Sean. You
couldn't have picked a more diverse and interesting panel of folks, I think.

And it's just because the ideas are-- many of us have been involved with this stuff
for decades. So just for background, my view is shaped by decades as a security
practitioner. So I grew up at the National Security Agency-- 35 years. Testing for
defense is my life. So everything from the mathematics of cryptography, up through
finding Zero Days in software, to field testing of live operational systems for the

DOD. So this is my life. And trying to make sense of it is what I'm doing at this stage
here.

And so when we talk about success, I'm more of a bottom-up practitioner of this
kind of stuff. But I think you'll see some linkage to the stuff that Mark talked about,
Joel and Larry, also, here. So when I think about success and what we ought to be
aiming for, for me-- and I think this matches with one of Larry's points-- we'll see

success when we talk much less about cyber technology and much more about risk
decision-making.

I mean, we're seeing a massive shift, I think, from our work at the nonprofit Center for Internet Security: everybody's in the cyber business, whether they
recognize it or not, right? Every company executive, every board of directors, every
auditor, every regulator-- and they're all just trying to make sense of it.

At the end of the day, yeah, we do have really a mismatched problem here of
economics. You know, the right things to do, technically-- whatever they might
happen to be-- are not the things that people are encouraged to do, or that the behavioral incentives really drive them to do.

And so we spend a lot of time talking about, do they have these things? Do they
have the firewall? Do they have this? Do they follow that? And you know, in the
grand scheme of things, people make decisions for all kinds of reasons. And they're
almost never about technology, especially business decision leaders.

So let's talk about cyber-tech less, and more about how we make decisions-- the way we do

in every other domain of risk in our lives, right, whether it's public health or safety
of a bridge or "is it OK to fly in a commercial airplane?" We don't ask people to be
practitioners. We don't ask them to understand the mechanics of it. We ask them to
make decisions, and we provide mechanisms to help them do that.

Another, I think, theme that will tell us we're on the right path-- and sorry, this is a
little counterculture-- you know, the mantra for the last 25 years has been

"information sharing." When we talk less about sharing and more about what we do
with it, right? Sharing is not the destination. It's a means to an end. And the end is so
that we know enough to make good decisions.

And I believe in my heart of hearts, with decades of experience, the vast majority of
stuff that gets shared about threats and attacks and this and that and the other
thing out there-- most of it is repeats of the same stuff over and over again. And

most of it's telling us stuff we already know. We're not managing our systems well.
There are flaws in our technology. We're following bad processes. Human beings are
getting fooled.

I don't need AI, fancy algorithms to tell me all that kind of stuff. I mean, there's a
need for much better technology than we have today, but the vast majority of what
we do today is actually kind of mundane stuff aimed at mundane problems or

rediscovering problems that we already know. And that's really unfortunate.

Another sign for me of a sort of healthy direction is we've grown-- I've grown up,
anyway, in a build-it-yourself environment. We're going to build security ourselves,
right? We buy technology from the vendors. We get a bunch of guidance from this.
And we compose it and we build it ourselves. That is unsustainable for most of our
economy. That'll never happen. There aren't enough trained people. The sort of "build-it-yourself-er" is a flawed model, right? Many folks have to do it today, but most folks won't be able to do it.

If you look at small and medium businesses, in particular-- this is not going to
happen. And it's not because they don't care or they're lazy. It's that the capability
just isn't there. We need to move much more towards a model of you buy it, right?
It's built into the infrastructure. It's built into the services. And then the role of folks

like us-- I guess, nonprofits.

I've pumped out as much security guidance, I think, as anybody except for maybe
Ron Ross from my time at NSA and at the Center. But at the end of the day, most of
this needs to be built in. And then the role becomes, how do we help people become
smarter buyers of security the way we ask consumers to be good consumers? Right?

We protect them from certain things through regulatory and laws and codification
and so forth and certification of people. And then we have market-driven forces that
allow them to make better decisions. Not perfect decisions, but better decisions,
right, where there's a way for people to operate in society without being paralyzed.
So less of the "do-it-yourself-er." And we encourage that through a lot of different
mechanisms, including policy and regulation. More of the "how am I going to buy it?"

The other thing I'll mention-- playing off of Mark's comments-- yeah, I mean, I grew
up in federal government. Some of the finest human beings, smartest, most
dedicated people I know, I worked with and they're still there. But we need a
different kind of leadership from federal government. And some of it Larry hinted
at.

This is not about government, the grand convener, handing down requirements
from on high or bringing out the one big bag of money. I grew up in the DOD, so
that's the way we think. One big bag of money and everyone will race in to meet my
requirements. And well it's not that kind of thing. Everything is too dispersed. It's too
interwoven with our economy.

And so you have to see this as a much different kind of role. We need the federal
government. And frankly, we've been saying, all my time in federal government, we
need to be a good example, and we're still not there yet for the federal
government. My advice-- and the teams I worked at NSA-- and if you have followed
my work-- I was probably as public a person as there was at NSA, in terms of
interacting with this community.

My advice to all my employees and to everybody I work with was, when you show up
and you want to help shape and influence the industry, you show up with content.
You show up with draft standards. You show up with ideas. You don't show up to
convene. You don't show up to organize everybody.
The industry-- there's lots of great examples of self-organization out there that we
need to, I think, mobilize towards more common direction. Having spent my last

seven years now, really, in nonprofit space-- I mean and I knew this growing up. I've
got 44 years now in this industry. This industry is full of amazing, talented people of
great goodwill who share this concern that we're in trouble.

And they're willing to put-- I mean, the only reason CIS has a business model is
because of volunteerism. We organized volunteerism. And the astounding talent
that none of us could afford to hire even if we could find them, that we can marshal

towards a common cause. And you can see it in our work, but you can see it in the
Cloud Security Alliance and OWASP. And some of the stuff that Larry's been involved
with.

I mean, it's astounding, the talent. But we need to organize better, right? And we
need to move some of this towards common cause that is really directed, not this
sort of-- great, we're all saying nice things to each other. We all Kumbaya, you know,

when we get together at RSA, but we are not really directing ourselves. And some
of that-- yes, we need the federal government, some of it, frankly. We in industry,
nonprofit, and for-profit need to get our act together.

And I think one point of hope for me is that many of us who-- I would consider
myself sort of an early-generation warrior in this space. Most of us are now towards
the end of our careers, trying to figure out, what in heck have we done in the last 30

or 40 years? And what's our last hurrah going to be? And we're scattered all across
government, the private sector, and nonprofits, and think-tanks-- and some
examples are even on this panel. We need to do more on our own, right?

No one's going to come down on high to solve this. We need to think about how we
self-organize-- the non-profits, all these different great groups that come together.
Anyway, so that's my thinking. It's a little different-- again, the view of a practitioner.

But I wanted to share with you. A lot of these do have connections to policy and the
kind of behaviors we encourage.

And the last thing is we have a lot of things that we say in this industry, that we say
and we all nod, but we don't actually define. For us, I think it will come up later in
the talk-- cyber hygiene. How many times have I heard, we all need better cyber
hygiene-- for example, and somebody will say, patching or whatever they happen to
say. But if you don't define things, then you don't know how to have a campaign.

You don't know how to measure progress. You don't know how to negotiate. Are you
safe to bring into my supply chain?

We need to be specific enough to make those kinds of decisions about negotiation


and about what I call aggregation. We want to know if a particular sector is getting better, and getting better faster, against whatever benchmark-- like the one Joel mentioned earlier.
So with that, looking forward to the next question. Thank you.
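To make Sager's point about definition and aggregation concrete, here is a minimal sketch in Python of scoring organizations against an explicitly defined control list and averaging by sector. The control names, organizations, and scores are hypothetical illustrations, not the CIS Controls or any official benchmark.

```python
# Minimal sketch: once "hygiene" is defined as an explicit list of controls,
# you can score organizations against it and aggregate by sector to track
# progress over time. All names and data below are hypothetical.
from collections import defaultdict

CONTROLS = ["asset_inventory", "mfa", "patching", "backups", "email_auth"]

def score(controls_in_place: dict) -> float:
    """Fraction of the defined controls an organization has in place."""
    return sum(bool(controls_in_place.get(c)) for c in CONTROLS) / len(CONTROLS)

def sector_averages(orgs: list) -> dict:
    """Average score per sector, so a sector's progress can be compared over time."""
    by_sector = defaultdict(list)
    for org in orgs:
        by_sector[org["sector"]].append(score(org["controls"]))
    return {sector: sum(vals) / len(vals) for sector, vals in by_sector.items()}

if __name__ == "__main__":
    orgs = [
        {"name": "UtilityCo", "sector": "energy",
         "controls": {"asset_inventory": True, "mfa": True, "backups": True}},
        {"name": "WaterDistrict", "sector": "water",
         "controls": {"patching": True}},
    ]
    for sector, avg in sorted(sector_averages(orgs).items()):
        print(f"{sector}: {avg:.0%} of defined controls in place")
```

Run against real assessment data, the same aggregation would let a regulator or sector coordinator say whether a sector is improving against a defined baseline rather than against itself.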

CHAPPELL LAWSON: I think that's terrific. So you all have heard several different definitions of success, and there's some consensus among them. And I think to put the second question in a slightly
more pointed way, I guess we can move through it rapidly. On a scale of zero to 10, I
would like to hear, how far along do you think we are-- either using your own
definition of success or some hybrid yardstick of what you've heard so far? So let
me start with Larry. And I'm just going to take a wild guess that the answer is

somewhere below 10.

LARRY CLINTON: You know, my friends tell me that I'm the guy who thinks the glass is three-quarters full. I'm really an optimist, and I'm struggling to be able to say we're at 1.5. I mean, I
think it's not that people aren't doing good work. Everybody around the table here
has done. Probably a bunch of people in the audience. I'd like to think we've done
some good work at ISA. But we're getting crushed.

I mean, I talk to people and I say, cyber security. They say, oh, yeah, I hear that's a
big problem. And what I tell them is, oh, no, no, no. It's much, much worse than you
think. And it's true. I also, by the way, had to get back to a comment that Tony
made, which I agree with. And we're at an academic institution so I want to make
some candid comments.

I think we really need to cut back on the happy talk. So we all go to these
conferences and I get up and I say what ISA is doing, and Tony gets up and says
what his organization's doing. We got the Solarium Commission. And then the
vendors get up and they have whiz-bang new technology. And then I go out into the
audience. Particularly, I want to talk to boards and stuff like that, which I do a lot of
nowadays. And one of the things I hear from these guys, they say, hey, sounds like
you guys got this covered!

And I'm like, no! No, we don't have it covered. We are losing. And we are losing big.
We would have to expand the football field to figure out how far away we are from
the goal line. We would have to double or triple it. Cybercrime, according to the World
Economic Forum-- it's a $2 trillion-a-year industry now, going to $6 trillion in a
couple of years. And that's a conservative estimate. Cyber Magazine just said it'll go
to $10.5 trillion a year.

Now at $2 trillion, the "cyber criminal nation," if we coupled them as a nation, they
would be in the G-10. They're just a little smaller by revenue than Great Britain. And
they're probably better organized than Great Britain. And what are we doing? The
FBI's budget to track down this massive criminal enterprise-- $4.5 million. That's million with
an M versus trillion with a T.

We are successfully prosecuting less than 1% of cyber criminals, and it has been
this way for decades. Everybody on this panel knows we do not have anything close
to a functional international framework to track and extradite and prosecute cyber
criminals, and we haven't for decades. I don't recall seeing a single hearing ever on
Capitol Hill in the years I've been doing this that focused on the cybercrime problem
in that sense. Not once.

I talked about the fact that the problem that we have here is the economics of
cybersecurity. I know of one hearing in my 20 years that discussed the economics of
cyber-- they talk about the economic impact of cybersecurity, but who cares about
that? I mean, we all know it's enormous. It's hard to figure out, it's so large. We are
getting killed.

And to go back to my friends, the Chinese. I mean, in Washington, D.C.-- Mark, you
can validate-- the big issue of the day-- I mean, supply chain is old news now. Of
course, everybody was really hot on supply chain. Supply chain was the new black.
Now it's 5G.

And that's kind of over. We pretty much lost the 5G fight. The Chinese for years have
been building these telecommunications networks in Asia and Africa, Latin America,

Europe. And their 2G and 3G is Chinese. And the 4G and 5G is going on top of that.
There is no way they are ripping and replacing this as they're fighting the COVID
epidemic.

And Huawei is the tip of the iceberg. I mean, I don't even care about TikTok. I'm
worried about Baidu and Tencent and Alibaba and China Telecom. We are facing a
massive, structural, funded adversary. And we are really still mostly talking about

information sharing and setting standards. We are not stepping up to the plate.

And, as I said before, we are not acting like we're married.

[CHAPPELL LAUGHING]

We're acting like an estranged couple.

CHAPPELL LAWSON: It's not a good marriage. All right, Mark. I'm going to go to you now because, for one, I know you may have to leave early to be called away to something else. But where
would you put us? Are you as optimistic--

MARK MONTGOMERY: I'll go [INAUDIBLE]. I'll give us a one. And here's the basis. So in 1999, or '98 to 2001, I worked for Dick Clarke at the National Security Council on Counterterrorism and Critical Infrastructure [INAUDIBLE]. And we wrote PDD 62 on counterterrorism and
an associated counterterrorism plan. I would say about 90% of what we asked for

has been executed. And then another 90% that we hadn't thought of pre-9/11 has
been executed.

Simultaneously, we wrote PDD 63 on Critical Infrastructure Protection-- that was in


'98-- and an associated National Infrastructure Assurance Plan in 2000. I went back
and looked at it when we were doing the Solarium Commission. I waited till we were
done because I didn't want to go back and re-fight battles I'd had then. But I would
estimate that we've accomplished 10% of the assignments in the NIAP.

And even then, the things we created were processes that could help substantially if
everything were going well. So things like the NIAC. I remember writing an executive
order for that-- the National Infrastructure Assurance Council. That still exists. But
we've done about 10%. So there's, mathematically, where we're at.

I do think we're in a lot of trouble. To pick up on something Tony said, it is amazing


the degree that we can't convince people to do personal computer hygiene. And I'll
say what it is-- it's multi-factor authentication, complex passwords, and don't answer
emails from Nigerian princes. If you can do those three things, your personal
hygiene is probably OK.

But the other one's the business hygiene. And this gets into, does your company use
enterprise or retail registrars? Do you use DNS? Do you use DNSSEC? Do you use
DMARC? There is a ten-point check. And I have to tell you, the vast, vast majority of
the 85% of our National Critical Infrastructure that's owned and operated by the
private sector does not pass that test. There's very select elements of our
infrastructure that do. The really big banks do. And they run ops centers.
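By way of illustration only, checks like the DNSSEC and DMARC items Montgomery mentions can be automated from outside an organization. The sketch below assumes the third-party dnspython package and a made-up domain; it is not the ten-point check he refers to.

```python
# Minimal sketch of an external DNS/email hygiene check, assuming the
# third-party "dnspython" package (pip install dnspython). The checks and
# the example domain are illustrative, not an official checklist.
import dns.exception
import dns.resolver

def has_dnskey(domain: str) -> bool:
    """True if the zone publishes DNSKEY records (a rough sign of DNSSEC signing)."""
    try:
        return bool(list(dns.resolver.resolve(domain, "DNSKEY")))
    except dns.exception.DNSException:
        return False

def dmarc_record(domain: str):
    """Return the published DMARC TXT record for the domain, if any."""
    try:
        for rdata in dns.resolver.resolve(f"_dmarc.{domain}", "TXT"):
            txt = b"".join(rdata.strings).decode()
            if txt.lower().startswith("v=dmarc1"):
                return txt
    except dns.exception.DNSException:
        pass
    return None

if __name__ == "__main__":
    domain = "example.com"  # hypothetical domain
    print(f"{domain}: DNSSEC signed: {'yes' if has_dnskey(domain) else 'no'}")
    print(f"{domain}: DMARC policy: {dmarc_record(domain) or 'none published'}")
```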

As a retired two-star, I would have appreciated that ops center as a one-star. It's like
my ops center except with comfy leather chairs. I mean, the banks actually have
good situational awareness. Now look-- that doesn't prevent all crime, but why are
they like that? Because that's where the money was 15 years ago. They've been
under constant attack.

The other thing I would say is-- look, it's not like the 15% that's dot gov and dot mil is

passing all those enterprise cybersecurity hygiene tests I just said. And it's
inconsistent in there, as well. If I could mention one other thing-- there is a piece, a
glimmer of hope on the horizon. And it's not that COVID has taught us a lesson. I
wish COVID had taught us a lesson about non-traditional national security
emergencies, but I don't think that's happened.

The glimmer of hope is ransomware. Contributing to that growth-- not the majority of it, but contributing to the $1 trillion in cyber crime-- is ransomware. What


ransomware is doing is it's making everyone a bank. It's monetizing data.

If you're Prince George's County and you're running utilities and taxes and billing,
and you hold a lot of people's personal information, when someone grabs your
system to either prevent you from getting in or to extract information and hold it for ransom, they're monetizing your data. And now you've got to

protect it.

And the reason this is important is our estimate-- and I would like to be corrected, I
think I'm broadly in the ballpark-- is that a good cybersecurity budget is one that
spends $0.10 to $0.12 out of every IT dollar on security. But the reality is, outside of
those banks, it's $0.04 to $0.06 of every IT dollar on IT security.
And this is like falling below nuclear criticality. We're now below the critical mass to get any reaction going, right? If you only spend $0.04 or $0.06 out of the dollar, you
might as well just spend zero because you've left so many gaps and so many
openings. You can't possibly have the people and processes in place to do
cybersecurity. That number is going to be driven up by ransomware.
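As a simple worked example of the arithmetic behind Montgomery's rule of thumb, the sketch below compares a security budget to his rough 10-12 percent band. The dollar figures are invented, and the band is his estimate rather than a formal standard.

```python
# Worked example of the spending ratio described above. The 10-12% "good"
# band is a rough estimate from the discussion, not a formal standard, and
# the dollar figures are made up for illustration.
def security_share(it_budget: float, security_spend: float) -> float:
    """Fraction of the IT budget going to security."""
    return security_spend / it_budget

if __name__ == "__main__":
    it_budget = 20_000_000       # hypothetical annual IT budget
    security_spend = 1_000_000   # hypothetical security spend ($0.05 per IT dollar)
    share = security_share(it_budget, security_spend)
    low, high = 0.10, 0.12
    print(f"Security share of IT spend: {share:.0%} (rough target band {low:.0%}-{high:.0%})")
    if share < low:
        print("Below the rough 'critical mass' band described in the discussion.")
```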

At some level, people are going to say, you got me once. I'm going to invest. Next
time CSO-- the new CSO, because you probably fired your CSO-- comes around to
talk to you about cyber security, he or she is going to get the budget they need. So
in my mind, we're at one out of 10. But there's a few glimmers of hope.

CHAPPELL LAWSON: All right. Tony.

TONY SAGER: Boy, this is pessimistic. But all good. Actually, in the interest of time, let me just tell
a couple of stories that I think will illustrate a position. So many of you know his
name-- Shawn Henry, ring a bell? With CrowdStrike? So Shawn helped start up the
cyber practice at FBI. He's a well-known guy.

Many years ago, I saw him quoted in the paper. It was so good I called him up to
make sure he actually said it. And I can repeat it with his permission. So he said--
must have been 12-plus years ago-- and his quote was, "Anyone in organized crime
who's not getting into this cyber stuff ought to be sued for malpractice." I mean,
yeah, that's what Larry's point was. This is where the money is. They're not going
away.

And in fact, it's gotten better. Anyone who doesn't do this as a criminal, in
espionage, et cetera-- what are they doing instead? Why would you run into a bank
and pull a gun and put your life at risk, right? And so this is a systemic, not-going-
away kind of issue.

The second quick story is-- so my son is in this business now, and he actually has
background in economics. So he's a self-made computer guy, now working in this
space. And he said, Dad, is it too late for me to switch into this career? I said, no,
son. No problem at all. I said, it's really clear from everything that's going on that
my generation will not solve a single foundational problem in computer security in
my lifetime. So lifetime employment for you.

But I need you to get better, right? Because I want my retirement checks to show up
every month. So there's so much to be done yet that has been said in many a
report. Mark gave a great example of-- and I've been involved in a number of these
national commissions and various big studies, both internal to government and
outside, and we're basically re-churning 60% to 80% of the same things over and
over again. All these-- be a good example, fix the economics, clean up the hygiene,
and all that kind of stuff.

So that's the bad news, is we got a lot of things that really need attention. The good
news-- and I'll give us a four because I am a hopeless optimist. Because you can't survive in this business for long unless you're a complete cynic, in which case there's plenty of work for you. Or you're really an optimist and you believe. And I
believe. And those signs I mentioned earlier, right-- the incredible talent, that
people really care about this kind of stuff, the level of awareness. People are
grappling with what to do.

I think those of us that have kind of grown up in this, we need to shift our thinking
about some of these issues-- that Joel mentioned and Larry mentioned and Mark--
around the economics of this, around organizing the policy of this, and getting
together. But the opportunity is there. I think the talent is there. The ideas have
been there for a long time. I think there's a recognition that we're at a point that
really matters, and we should do something about this.

And again, my generation is sort of the first that's heading to the end of the cliff
here and wondering, what is it we could do to leave things better? So I have an
amazing network of people that-- and every one of us on this call does-- that we tap
for these big studies and all these different ideas and panels and appearances and
so forth. We really need to do better than the happy talk. We do need to say, what is
it we need to do, not what we need to talk about anymore? We've admired this
problem for decades, so for me, a four.

CHAPPELL LAWSON: I think that's where we're headed next. But I did want to give Joel the opportunity to rate us on a scale of zero to 10. And I must say, I would rate us higher than everybody else, probably because I'm looking solely at cybersecurity for critical infrastructure,
not the whole ball of wax or every possible meaning of the term cybersecurity or
term that has "cyber" somewhere embedded in it.

And I guess I would say it in four steps. First, there are a very small number of
systems and firms that are really systemically important. Second, if we protect
those, we will dramatically mitigate the chances of systemic failure, which is, to my
mind, where the government ought to be focusing its efforts. Three, the best way to
do that is to have more operational coordination between the government and
industry, which looks a lot like the FSARC, the ARC-- something more like a real-time war room situation. And fourth, we already have some models and some
sense of that's where we're going. So that's why I would give us a five. But Joel?

JOEL BRENNER: Oh. I agree with you that there are aspects of critical infrastructure that have
become significantly better, and that's why I'm not giving this a zero. As a nation,
however, I think we're failing. And I'd give us a one out of five. I'm a pass-fail guy here, and we're failing. You can quarrel about a one or a two. If you think we're at a four or five on a five-point scale, I think you're smoking funny cigarettes. We're failing.

I also want to emphasize something that Larry said that's very important. I want to
make sure everybody in the audience understands this. The fundamental problems
are not technological. We do have some technological challenges. The fundamental
problems are managerial, behavioral, economic, and legal.

The incentives we really look for are positive ones in the form that the tax code,
among other things, can deliver. Negative incentives in the form of reasonable
liability that looks like liability in other sectors of the economy, not more than that.
And last is regulation. In a capitalist economy, the incentives are the thing that
makes the world go round or stops it from doing certain things.

There are lots of things we can do here. I mean, I've disagreed with my fellow
panelists in terms of defining what success looks like because I think we should do it

in substantive terms. We've heard a lot about things that all of us agree about,
which is how to get there, but information-sharing or-- these are not goals in and of
themselves. They're ways to get somewhere. But I think we're doing poorly, and we
need to recognize it.

CHAPPELL LAWSON: All right. So let me start the third round, I guess, and we can be at this point moving into sort of lightning responses. But Joel, if you had a magic wand and you could

wave it once, what would you change to make us better?

JOEL BRENNER: I would focus on infrastructure and isolating controls and reducing the complexity of
components. But again, I think the way to get there would be incentives-- with
positive and negative, as I've described.

CHAPPELL LAWSON: Larry?

LARRY CLINTON: One word for you, Benjamin. Economics. We have to fundamentally change the economics of this, and we can. So before he left, Mark said, well, ransomware!
That's the light at the end of the tunnel because everybody will lose so much. I've

been hearing for 20 years that the big one is going to come and everything's going
to change. I don't think that's going to happen.

Ransomware is going to continue to evolve as our defenses evolve. We need to


develop a dynamic system of cyber-defense. That comes when we change the
economics. Virtually nobody smokes anymore, as opposed to my parents' generation, when everybody smoked. You know why? It's not because the Surgeon
General said, smoking causes cancer. We actually already knew that. It's because
the economics of buying cigarettes changed so much. And it led to bad--

CHAPPELL LAWSON: --economics that has to change. I mean, I get the sense from Joel that it involves regulation. And I get the sense from you that that's not your answer.

LARRY CLINTON: Regulation is appropriate in certain circumstances-- particularly, for example, where the fundamental economics of the industry are at stake, as they are in much of critical infrastructure. But we would change the form of
regulation. That's the important point. The regulatory structures we currently have
are these elongated lists of requirements-- none of which have ever, by the way,
been shown to enhance cyber security.

Read Doug Hubbard's book. He does a great analysis of all of these checklists and
finds that, first of all, nobody fills them all out, even the banks. And they don't know
whether or not they're supposed to start at one and go to 250, or they could start at
250 and go to 500. What we need to do is adopt these newer, more modern methods of cyber risk assessment-- things like Factor Analysis of Information Risk (FAIR) and X-Analytics, et cetera-- which put cyber risk in economic terms so that
organizations can better assess what their risk appetite is and then can mitigate
down to what their risk appetite is.
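To illustrate what putting cyber risk in economic terms can look like in practice, here is a minimal FAIR-style Monte Carlo sketch in Python. The event rate, loss distribution, and risk appetite are invented for illustration and are not drawn from the FAIR standard, X-Analytics, or Clinton's remarks.

```python
# Minimal sketch of a FAIR-style loss simulation: annual loss = (number of
# loss events) x (loss per event), compared against a stated risk appetite.
# All parameters are hypothetical; real analyses are considerably richer.
import math
import random
import statistics

def simulate_annual_loss(event_rate=2.0, loss_median=250_000, loss_sigma=1.0,
                         trials=10_000, seed=7):
    """Monte Carlo annual loss: event count ~ Poisson, severity ~ lognormal."""
    rng = random.Random(seed)
    results = []
    for _ in range(trials):
        # Draw a Poisson-distributed event count by inverting the CDF.
        u, k, p = rng.random(), 0, math.exp(-event_rate)
        cumulative = p
        while u > cumulative:
            k += 1
            p *= event_rate / k
            cumulative += p
        loss = sum(rng.lognormvariate(math.log(loss_median), loss_sigma)
                   for _ in range(k))
        results.append(loss)
    return results

if __name__ == "__main__":
    losses = simulate_annual_loss()
    appetite = 1_000_000  # hypothetical annual loss the organization will tolerate
    print(f"Expected annual loss: ${statistics.mean(losses):,.0f}")
    exceed = sum(loss > appetite for loss in losses) / len(losses)
    print(f"Chance of exceeding the ${appetite:,} appetite: {exceed:.1%}")
```

Expressing the output in dollars and exceedance probabilities, rather than as a checklist score, is what lets an organization compare its exposure to a stated risk appetite and decide how far to mitigate.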

And then we combine that with the policy that is already in the National
Infrastructure Protection Plan with regard to critical infrastructure. The private
sector-- and this is where we disagree with the Solarium Commission. The private sector
is charged with providing commercial-level security, not national-level security.
There is a gap between what is commercial-level security and national-level
security, and it is the public sector's job to provide what goes above that gap.

You can't ask the private sector to continually make uneconomic investments in
cybersecurity. We need the economic sectors to continue to work. So we need to

evolve a system of incentives. And tax incentives should work for small companies,
not for the big guys. For the big guys, there are a wide range of non-tax incentives
that we already have in the economy, that we have in agriculture and aviation,
ground transport--

CHAPPELL LAWSON: All right. I got to hear what Mark says because he's still with us.

LARRY CLINTON: --make a mess with different incentives, so I can tell you. But anyway, I can go into great depth--

CHAPPELL LAWSON: This is good! No, I think I understand what you're driving at, and it makes sense. And it is, in fact, the alternative to regulation. But since you took Mark's name in vain--
and he is still with us-- let me turn it over to you.

LARRY CLINTON: Yes, Mark, please.

MARK MONTGOMERY: Well listen. I agree with most everything that was said today. I certainly think it'll be challenging to have the federal government help close that gap, but I do think it's a
unique issue. For 70 years, we haven't had to defend our critical infrastructure. When
you think about any other warfare area-- the government owns every plane, every
submarine, every tank, every ship. So we really haven't had to factor in the private
sector.

So new logic is required. And we've done new logic before-- the Tennessee Valley
Authority was new logic. How we retooled the Ford Motor plant to build
tanks and aircraft in the 1940s was. Even SEMATECH to some level was new logic.
But on the issue "if I had one thing"-- first, I like all the things that were mentioned. If
I only had one thing, I'd take leadership.

This is going to take leadership. One reason we created the National Cyber Director
was you actually needed empowered, strong-- I don't want to call them
independent, but someone who has both access to the president and to the Senate
and House to make things happen, to be that belly button for the CEOs to come to
with problems, who wakes up in the morning and their number one issue is cyber.

It's not like the National Security Advisor wakes up and his number one issue's

S-400s in Turkey, his number two issue's a carrier in the Arabian Gulf. He gets to
cyber around number 23. You need someone to wake up number one.

If we're going to prevent a nontraditional, national security emergency like COVID


occurring in cyber-- and I don't know if it'll be a big one or if it'll be a sustained
campaign of small ones. But we are losing potential and actual GDP every year. I
believe it's in the trillions, like Larry says, overall. And I think it could be worse if a
nation-state really turned their eyes on us.

You need leadership, Rickover-like leadership. You know, someone who takes
something and absolutely applies a standard to it. And it's going to be very hard to
do. And the battle lines have been drawn for 20 years-- between agencies and
between the private sector looking at the federal government saying, you're not
providing much. I show up at the table and I hear the same ridiculous briefs every

time.

Leadership. Hopefully, take a look at our National Cyber Director recommendation.


Obviously, if you picked the wrong person-- if you don't really, really pick the right
person, you're going to be in a lot of trouble. And so I hope they identify and pick the
right person, the Biden Administration. And I suspect they will. And when they do, I
think that will help change things. It's going to take a lot more than one person-- but
it's leadership. And I do have to check out, but thank you very much. I've learned a
lot during this, Chap.

CHAPPELL We should go to Tony. You get a chance to wave a magic wand as well.
LAWSON:

TONY SAGER: Oh wow. Well short answer-- I've already hinted at it. And it combines a couple of
things. Mark talked earlier about how we know there's a list of things that everybody
ought to do, whether you call it hygiene or foundational or whatever. What we've
done at the Center for Internet Security is we have actually defined what I talked
about earlier, right?

Here's a specific set of things-- and we didn't just make the list up. Most of the stuff
that is created-- all those regulatory lists and checklists and things like that-- good
people get together and they argue until they come up with a list. But as I think
Larry might have said, right, so what value does that have against the attacks that
are happening?

Well it's good to do. These are good things to do. That's how we've sort of
traditionally looked at this. We don't have the kind of data or the modeling that we'd

expect to underlie the actuarial science that we have in many other domains. And
we're a ways from getting there.

So what we've done, though, is analytically look at what we think is a reasonable
and small and foundational set of things, bounced it against the best available
summarized data on the attacks that are in the wild, and modeled that using the
closest thing we have to an industry standard. I won't go through the technical
details, but the idea is that everything you could do in defense has either some
impact or no impact on the adversary's lifecycle-- their ability to do
reconnaissance, to do the first thing, to move around, to exfiltrate data.
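As a rough illustration of that idea (this is not CIS's actual model; the control names, lifecycle stages, and weights below are invented for the example), each defensive control can be scored by which stages of the adversary's lifecycle it disrupts, weighted by how often those stages appear in summarized attack data:

```python
# Illustrative sketch only: rank controls by the attack-lifecycle stages they disrupt.
# Stage prevalences, control names, and coverage sets are hypothetical.
ATTACK_STAGE_PREVALENCE = {          # share of observed attacks touching each stage
    "reconnaissance": 0.60, "initial_access": 0.90,
    "lateral_movement": 0.55, "exfiltration": 0.40,
}

CONTROL_COVERAGE = {                 # stages each control has some impact on
    "asset_inventory":        {"reconnaissance", "initial_access"},
    "secure_configuration":   {"initial_access", "lateral_movement"},
    "mfa_and_admin_controls": {"initial_access", "lateral_movement", "exfiltration"},
    "continuous_monitoring":  {"lateral_movement", "exfiltration"},
}

def control_value(stages_covered: set[str]) -> float:
    """Sum the prevalence of every attack stage a control can disrupt."""
    return sum(ATTACK_STAGE_PREVALENCE[s] for s in stages_covered)

ranked = sorted(CONTROL_COVERAGE, key=lambda c: control_value(CONTROL_COVERAGE[c]),
                reverse=True)
for control in ranked:
    print(f"{control:24s} value = {control_value(CONTROL_COVERAGE[control]):.2f}")
```

The scores themselves mean nothing; the point is that a checklist item earns its place by demonstrably interrupting something attackers actually do.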

And so you want to be able to look at your list of things to do in an analytic way, to
make sure that it has some specific security value to it. But that's not enough.
That's why there's no one thing, even in a lightning round that I could tell you. So

when we're working with states-- a couple of states now, and a number of others are
thinking about it-- what they're moving to is a model not of regulation, but of
incentivized voluntary adoption.
That is, to offer some level of liability protection or safe harbor. If you can
demonstrably say you are putting in place these practices-- and they include a
reference to the NIST cybersecurity framework and to the work at the Center for
Internet Security, right? So you have to kind of combine specifically what you want
people to do with some incentivized way for them to look at that. Right? As opposed
to this-- because as soon as you try to crush it down as mandatory, there's so much
push-back to all that kind of thing.

So we're looking at this as a model. People have been coming up with this. People
are studying it to see, is this a more scalable model, at the federal level or

otherwise? So we have to find a way to address the economics in addition to what's


the most important technical thing to do. So that's what we're doing at the Center.

CHAPPELL I'd like to bring Taylor Reynolds into this conversation in a sec. But I feel like, I didn't
LAWSON: mean to cut you off, Larry, so I should go back to you. And I mean, I'm hearing talk
about incentives, but I'm not exactly sure of the form they would take.

And really, incentives for the private sector to invest in a higher level of security,
when we don't know which investments would actually generate security-- to me,
this may not be the best public policy instrument. It might be better to instead insist
that the truly crucial firms get together and share information in a small enough
group where you could establish some trust and then direct resources in a sort of
peer-reviewed fashion among those firms. That strikes me as a different model than
either the tax incentives, which is kind of a blunt instrument, or subsidies for larger

firms to make checklist-style investments.

LARRY So if you're asking me, we think that the incentives need to be calibrated to the
CLINTON: industry sector. So you need a different sort of incentive in the utility sector than
you do in the defense sector, than you do in the IT sector. So we look for a menu of
incentives, and these incentives all ought to be calibrated to really one of our
fundamental and most longstanding requests. Tony will remember this from the
days when we were developing the NIST framework. These frameworks, these
procedures need to be empirically tested for cost-effectiveness.

We have no idea if the NIST framework helps in terms of security at all. I suspect, by
the way, some of it does. I suspect some of it doesn't. And I think some of it is
effective but not cost-effective. 10 years later, we need to know these things. And
then you calibrate the incentives to the cost-effective mitigation steps, based on

what is good for that particular sector.

That's why, when I said at the top-- we advocate for the Office of Digital Strategy
and Security, which is a much broader approach than the directorship that is in the
Solarium Commission. That's a good idea. That's a step in the right direction. But we
need something much broader. And part of their job would be to be testing the
government programs the way we do in industry. Whenever we inject a program, we
test it, and then we modify it based on what's going-- we don't do that in
government!

We should be doing that. And then you apply the incentives based on what is cost-
effective. If things are cost-effective, we shouldn't need to be providing an incentive
for it. But if things are effective but are cost-prohibitive, that's when you provide the
incentive. And these incentives do not have to be monetary. There are liability
incentives, as Joel has pointed out. There's insurance incentives. There are

procurement incentives. There are very creative forms of incentive.
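A crude way to picture that calibration rule (purely illustrative; the mitigation names, costs, loss-reduction estimates, and effectiveness flags below are invented for the example) is to sort mitigations by whether they pay for themselves:

```python
# Illustrative only: incentives go to mitigations that are demonstrably effective
# but not cost-effective for the firm on its own. All figures are hypothetical.
mitigations = [
    # (name, annual cost to firm, expected annual loss reduction, shown effective?)
    ("patching_program",          200_000,   900_000, True),
    ("ot_network_segmentation", 4_000_000, 2_500_000, True),
    ("checklist_item_47",         300_000,         0, False),
]

for name, cost, loss_reduction, shown_effective in mitigations:
    if not shown_effective:
        decision = "drop or retest: no demonstrated security value"
    elif loss_reduction >= cost:
        decision = "cost-effective on its own: no incentive needed"
    else:
        gap = cost - loss_reduction
        decision = f"effective but cost-prohibitive: incentivize ~${gap:,.0f}/yr"
    print(f"{name:26s} {decision}")
```

Whether the incentive that closes the gap is a tax credit, a liability safe harbor, or procurement preference is then a sector-by-sector choice, which is the "menu" idea above.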

For example, in pharmaceuticals, if you have a good track record with the safety of
your drugs, you can get to the front of the line-- a fast track to get your new drugs
approved. We should have a situation like that for IoT because the problem with IoT
is that they just rush this stuff to market so quickly because you've got to get to
market quick. And if now there was a cyber incentive that gets you to the front of
the line to get your product to market faster, that's a market incentive that doesn't
hurt anybody, doesn't cost any money for the federal government, and could
actually result in industry itself having a dynamic reason to improve their own
security.

That's what we are saying. And by the way, one of our proposals that this Office of
Digital Strategy and Security would enact would be to create a study to define what

these incentives are even more broadly than we have here. So we're not interested
in a cookie cutter approach. We're interested in a very strategic metrics-oriented
approach.

And that should go-- by the way, going back to regulation. The regulation should be
economics-based and very, very much grounded in research and cost-effectiveness. And
we can do all this, by the way. And it doesn't cost a whole lot of money to do all this.
We're not talking about a massive federal program to do studies.

Studies are what we do to kill bills. We give somebody a study. We could study the
NIST framework, and we could've done this 10 years ago. So no. We think there's a
lot we can do for very little money and with some creativity. And by the way, that's
what we're supposed to be good at here in the United States, is that kind of stuff. So
I think we can do this, but we're not doing it.

CHAPPELL Taylor?
LAWSON:

TAYLOR Hi. Thank you very much. This has been really a fascinating discussion. And I love
REYNOLDS: the focus that people have put on the economics behind this. I'm an economist by
training, and digging into cybersecurity when Joel held his workshops on protecting
critical infrastructure about five years ago, I was surprised at how little focus there
was on the actual numbers behind these attacks, behind the losses. And for an
economist, that makes it very difficult for us to make decisions. We have to have
some hard and fast numbers to work with.

Larry, I loved how you mentioned the economics of cybersecurity over and over,
that it's not being discussed enough. I'll have to tell you, I was really surprised in
talking to CISOs about the losses that they're taking by being hit all the time with
these cyber attacks. When we talk to them, they say, well, we don't really calculate
the losses.

What we do is we typically have a bucket-- we have the low, medium, and the high.
And if it's in the "high" bucket, then we'll turn it over to the forensic accountants who
are working for the insurance provider. And they'll figure out how much we lost. But
other than that, it's really like, well we're just getting hit a lot, and we don't know
what's happening.

The problem with that is that then if we try and come up with these numbers like $2
trillion in losses that we heard about, where is that number coming from? Because I

feel like no one has a clue how much money they're losing. And so we have a
platform that we're doing now to try and put numbers behind how frequently
controls are successfully attacked and how much money is lost when they're
successfully breached.

So we built a platform to do this. We've run some computations using multi-party


computation to keep data secure. And we're starting to put some of those numbers
there. But one of the difficulties has really been getting the firms to figure out how
much money they're losing.
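For readers wondering how firms can contribute loss figures without exposing them, the basic building block behind this kind of secure computation is secret sharing. The sketch below is only an illustration of additive secret sharing, not the SCRAM protocol itself; the party counts and loss figures are invented.

```python
# Illustrative sketch only: additive secret sharing, the basic idea behind aggregating
# sensitive loss figures without revealing any single firm's number. Not SCRAM itself.
import secrets

PRIME = 2**61 - 1  # field size; must exceed any plausible aggregate loss

def share(value: int, n_parties: int) -> list[int]:
    """Split `value` into n random shares that sum to it modulo PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    last = (value - sum(shares)) % PRIME
    return shares + [last]

def aggregate(all_shares: list[list[int]]) -> int:
    """Each aggregator sums the shares it holds; combining the sums gives the total."""
    n_parties = len(all_shares[0])
    server_sums = [sum(firm[i] for firm in all_shares) % PRIME for i in range(n_parties)]
    return sum(server_sums) % PRIME

# Three firms report annual cyber losses (hypothetical dollars) without revealing them.
losses = [1_200_000, 450_000, 8_300_000]
shares = [share(v, 3) for v in losses]
assert aggregate(shares) == sum(losses)
```

No single party sees another firm's number; only the aggregate is revealed, which is exactly the property that makes firms willing to contribute the data in the first place.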

And the other side of this is, if you are a CISO, you need to convince the Chief
Financial Officer, you need to convince the board that they should give you more
resources to attack this or that. And they speak the language of dollars, of money
and dollars. And the problem is we're just not making that happen very well.

I just wanted to point out one other quick thing about the platform we've built-- this
SCRAM platform, which stands for Secure Cyber Risk Aggregation and Measurement.
We're running computations with very large firms to pull out what's really
happening behind the scenes and getting some numbers on those. Our goal is to be
able to empirically test for cost-effectiveness.

So while we have these different frameworks out there, we want to be able to test
them and say, well are they capturing the things that tend to be losing the most
money? So very interesting. I'd be really interested in your thoughts and ideas of
how we can get firms to think more about the actual losses that they're incurring.

CHAPPELL Thank you so much, Taylor, for adding that. I also see a couple of questions in the

LAWSON: chat that I want to make sure you all take a swing at. One is from my colleague in
nuclear engineering, Scott Kemp, who suggests moving the focus away from
economic cybercrime and toward critical infrastructure, which is in many ways more
important to the life of the nation and might be easier to crack, because much of the
critical infrastructure can be physically configured to make failures more contained
and maybe more rapidly recoverable if we take the right physical steps.

So I would love to hear your thoughts on that. And for how many different critical
infrastructure sectors-- truly critical ones, not just those carrying that label-- can we
simply build in air gapping and other physical remedies? And then a question from
Jared Gansel, also here at MIT. I think it might
be focused at Tony, but others could weigh in.
We continually face this challenge of convening public and private stakeholders in

the emergency management field into productive discussions for systems thinking
and making sense of the problem. And have you faced this sort of challenge in
cybersecurity? And if so, what have you done that has worked to move the
conversation forward, beyond what you've already mentioned, Tony? And do you
have any advice from the cybersecurity world that might be relevant for, say,
emergency management?

Jared also asked another question, and because he's Jared, he gets two. This is
probably for Larry. These are sometimes low-likelihood events, the prospect of a
concerted attack by a well-resourced adversary, so that it never really enters into
the capital budgeting process for corporate managers. And how do you address that
specific incentive? So let me go to Tony first. And then Larry, then Joel.

TONY SAGER: Yeah. Thanks. A great question. The remark came up earlier about-- again, I was in

government for 35 years. And actually it was in 2001, I led the campaign to release
the NSA Security Guidance, which we developed for the DOD and the intelligence
community as a function of red-team testing and that kind of stuff, released it to the
public through NSA.gov in the early days of government stuff going on the web.

And that gained us more credibility, more friends, and more relationships with the
industry than anything else we could have done. I had to convince bosses that the
world was changing, right? Open source and open standards-- that you get more
ability to be part of the environment by giving stuff away than you do by trying to
retain control of it, right? As opposed to setting up a unique government standard.

So I sent lots of people out in the industry to work with standards groups. A number
of the early security standards and security automation were written by folks that
worked in my group and that kind of a thing. But the idea was to be involved in a lot

of these activities directly, bringing smart people-- a lot of great people work in
government. But if they're hiding behind concertina wire, then it's easy to treat
them as "those government guys don't know what they're doing."

So for me, that was bringing a lot of openness to that. A lot of that got undone,
frankly, for lots of current event kind of things over the last few years. I think the
common wisdom among my industry friends is that the government's been absent
from a lot of activity around the world and standards bodies. Larry mentioned 5G. If
you're not there, then you're not shaping the discussion, right? So you have to be
out there.

And again, people, when you show up and you say, "We're the government, we're
here to organize you," people aren't interested anymore. That's just not the way this
flows. You have to think of this as-- we come out as participants. We bring smart
people with big ideas and documented thoughts and so forth around that.

So that was, for me, again, sort of a-- as I shifted from government, the whole model
of CIS is we create stuff through volunteers, and we give it away. You still have a
company to run, and we support it through a membership model. But it's this idea
that the greater good is best served by this free availability of content.

And not everybody can pull that off. But we're lucky enough to be in a space that
allows this to happen, which then creates partnerships and friendships and
opportunities to create new things. Thank you.

CHAPPELL Great. Larry?


LAWSON:

LARRY Sorry, I was on mute. First of all, ISA has the same model that Tony has, and it does
CLINTON: work. Just quickly on that, I think that what Tony is talking about and what I was
alluding to before-- we need to fundamentally alter the thinking. We have to rethink
how we're doing these things.

And we can do this through industry-government consortia that are much more equal.
We did this with NASA. We did this with SEMATECH. By the way, that's largely how
CMMC was developed: they had a much more egalitarian structure with co-
chairmanships, an equal number of industry and DOD people sitting around the
table working out what CMMC was.

So this kind of stuff can be done. So that's the direction we need to go to. With
respect to really critical infrastructure, I'm not positive exactly what we mean by
"really critical infrastructure."

TONY SAGER: Super critical.


LARRY -- seems to be pretty critical, but it's really different. I suspect you're talking about
CLINTON: utilities--

CHAPPELL Not restaurants. I'm not talking about the food sector.
LAWSON:

LARRY So I have always been on the end where I thought, yes, these are very low-likelihood
CLINTON: threats, taking down the telecom system or taking down the electric grid. But the
reason was economics. So the Russians and the Chinese have, for a decade or so,
been able to take down the telecom system or the electric grid, but they wouldn't do
it because their economy is interconnected to our economy.

If you crash the electric grid on the East Coast in winter-- which was fantasized back
in the days where they had these regulatory bills floating through Congress. That
would crash the US economy, which would have crashed the Russian economy and
crushed the Chinese economy, which would've crashed the political systems. So
they weren't going to do it. They don't want to take down these major
infrastructures.

They want to use them. The Chinese don't want to take down the internet. They
want to use it. They're making tons of money off it, and so are all the criminals. Now
to get to how you deal with these things: will the more traditional approaches-- for
example, the crime-prevention cyber techniques-- help? Yeah! You
motivate people in their self-interest to harden their systems and follow good,
documented hygiene and mitigation practices, by putting in modern risk
assessments that frame all of this in economic terms.

And they will harden the systems. And that hardening of the system that defends
them against sophisticated cybercrime attacks-- and by the way, the cybercrime
attacks are just as sophisticated as the nation-state attacks at the high end. And if
you're hardening against the cyber crime attacks, you're similarly hardening
against the nation-state attacks. They're pretty much the same thing at this stage.

Some of the cyber criminals used to work for the nation-states. Now they contract--
they've gone into business for themselves. So that's the answer here, is that we have to reach
people's mutual self-interest, make this economic for all of us. It's in all of our self-
interest.

And remember, we're losing tons of money by not doing this. The estimates--
Taylor's of course right. I mean, is it $10 trillion or is it $5 trillion? Doesn't make that
much difference to me. It's tons and tons of money, huge drag on the economy.

We'd be much better off if we just invested in this. And we don't invest in cyber
security at all. We're trying to do this on the cheap. Remember, we're spending
$450 million on the FBI to follow a trillion-dollar cyber crime. The Chinese are
spending $1 trillion on the Digital Silk Road, and they're spending a whole bunch on top

of that. And we can get to it, but they're crushing us worldwide. They're making
friends and influencing people. This is a digital Marshall Plan. And they're winning.

So we have major-- if you want to be concerned about the threats to the United
States, that's our major threat: that they are trying to overturn the Western world order
and make Chinese currency the dominant currency. And they're 20, 30 years
away from being able to do this. And I take it from some looks on the panel that you
don't believe me. It's true. So.

CHAPPELL It's sounding like if we asked you about Chinese-American relations, we would
LAWSON: probably get an answer there was even below your 1.5 you mentioned. Let me ask--
Joel, if you would, take a swing at Scott's question. And any others, if you want to
weigh in on that.

JOEL BRENNER: I don't agree with Scott's premise. Or at least, perhaps, we haven't been clear about
it. Scott writes that our discussion has largely been informed by economic cyber
crime. Really, not only because of what Larry just said-- that hardening against
cybercrime also hardens against other things-- but let me give you an example of
something that has nothing to do necessarily with cyber crime.

Electricity in the United States is governed by the federal government only at the
interstate transmission level. And it's governed by public utility commissions and state
law at the generation and distribution levels. The public utility commissioners are in
very politically sensitive positions. Security has to be paid for, but raising rates is
politically difficult, especially politically difficult when lots of people are suffering
and can't make ends meet now.
And yet, security has to be paid for. So there's lots of room in a place like this for
incentives of several kinds-- some just outright subsidies, perhaps. Another might
be using the federal ability to regulate transmission as a lever into utility operations
at the distribution and generation levels. I think it would require people who do
this for a living to figure out the details of that.

I can't. But I'm pretty sure we could do that. So Larry's been talking-- if I can speak
for him for just a second, although he speaks well for himself. And Mark and Tony,
the incentive issue goes way beyond the cyber crime problem. It has to do with
critical infrastructure and businesses in general and investment.

But really, we're talking about incentives, both legal negative incentives and
positive incentives, not just about cyber crime. By the way, Chap-- I put regulation
at the bottom of that list, not at the top of that list. In a capitalist economy,

regulation's much less important as a driver of behavior than tax, positive


incentives, and liability. That's where I think we need to go. I hope that clarifies our
position for Scott.

CHAPPELL Sean? I'm going to turn it over to you for a final question. And then, any benediction
LAWSON: you want to offer, because I know we have to wrap up at 3:30.

SEAN ATKINS: Thanks, Chap. So one of the questions-- and this is probably more geared toward
Mark, since it's a Cyber Solarium-based question, but I think the rest of the group
might have some insight into it, too. And so, in reading the Cyber Solarium report,
the three pillars are, first, building norms-- so mostly a diplomatic function-- and
second, building resilience-- which is where I think most of the discussion we've had
here fits, since it has a lot to do with public-private partnership, building
collaboration and those sorts of things, and incentives as well.

And then the third is imposing costs, which tends to focus on military-related
actions. But the question is about the first and the third-- building norms and imposing
costs. There's not a lot of emphasis on public-private partnership there. But I think
in reading and talking to folks, that there is space for that, or at least the beginnings
of working together on those two parts.

An example would be that Microsoft and, I think, Siemens both have initiatives to try to
build some normative developments, maybe out of frustration with government efforts.
And on cost imposition: if the military is to impose costs in a defend-forward-type strategy,
that takes some knowledge of the private sector side and what they're defending.
So I was wondering if folks could speak to that a little bit?

CHAPPELL Maybe, maybe, Tony, you want to start?


LAWSON
(OFFSCREEN):

TONY SAGER: Yeah. I'm not sure where to go with that, Sean. But I mean, I think the point is, I think
there is plenty of room for figuring out things like-- people want to partition this
problem neatly into critical, non-critical, government, non-government. And I just
think that doesn't make any sense. There's too much overlap. There's too much
mixed use.

Like in the DOD, right? We never go to war without our friends. The complication is
we don't know who our friends are until we go to war.

And so you always have to think of this as a composition problem.
And so people say "critical," there's an implication that everything else doesn't
matter. My view is completely the opposite of that. Everything matters. Some things
matter more. Everything deserves some level of protection. Anybody that does a
risk assessment that says, "This part of my infrastructure is not critical so I don't
have to worry about it" is going to make a mistake. That's just guaranteed to be
wrong.

So whether it's public or private, to me, you have to think of it as a whole, and then
therefore, you need to create those-- both the practices, the incentives, and the sort
of management machinery as a whole.

JOEL BRENNER: Yeah. I thought on this-- one aspect, I think it was Sean, you were raising, has to do
with our deterrence policy. Or more accurately, our utter lack of one, and our
inability so far to impose significant costs on people who attack us.

I believe that the Solarium Commission was right about this in two respects. One is
that, in the United States, we have no choice but to look at the denial aspect of
deterrence policy. We must harden our targets. But we have absolutely relinquished
up until now-- until recently, anyway-- the intention of imposing costs-- that's to say,
punishment, to be frank about it-- on people who attack us. And this is just an
absolutely losing proposition.

I think that Cyber Command's "defend forward" policy, or its persistent engagement
policy, is fundamentally correct. The difficulty for those of us who are now on the
outside-- Tony and I used to be on the inside-- is that you don't really understand

what they're doing, except in a few cases where they tell you, unless you can see the
target list. And we can't. And we shouldn't.

But we cannot sit by and allow people to attack us without smacking them back.
And I'm talking about smacking them back in non-kinetic fashion, below the level of
armed attack, as defined in the law of armed conflict. So I think this is a very
important part of what we have to do. And at the same time, I think the
fundamentally most important part of our deterrence policy must be becoming a
much harder target.

LARRY Yeah. I would agree with Joel's comments in that regard, although I will confess, I
CLINTON: have not figured out the proper way to do cyber deterrence in this particular sense.
So it's for greater minds.

I would associate myself with Tony's comments. Notwithstanding and with all
respect, I'm not a big fan of the designations of critical infrastructure with respect to
cybersecurity, because the fundamental defining quality of the internet is
interconnection and interdependence.

So in one of our books that we did for the Association of Corporate Directors, we
had this example of this highly important secure facility, this element of critical
infrastructure that was doing a real good job with security. And then the bad guys
found out that at lunch, the employees at this facility liked to use a local Chinese
restaurant. And so what the bad guys did is they loaded the malware onto the online
menu at the Chinese restaurant and through there, got into the-- so now Chinese
restaurants become part of our critical infrastructure.

The interdependency-- so we need a whole system sort of solution, which is much,


much more difficult, I understand. I think the whole critical sector thing is really a
holdover from physical security. We don't have critical assets in cyber. We have
critical functions.
And let me tell you, it was a long fight with DHS 10 years ago when we were in the
ITSCC-- the IT Sector Coordinating Council, and we had to do a sector-specific plan.
And they wanted us to identify our critical assets. And we said, no, we don't have
critical assets. We have critical functions.

My point being, we have to rethink all of these things. We are holding over a bunch
of ideas from the last war that I think are impeding us from moving ahead into the
digital age. And we really don't have anybody in government doing that. We don't
have an office where they're in charge of what we would call in industry "digital

transformation." What are we going to do in the digital age? How do we look


different? How are we going to process, et cetera?

And I think a lot of these things, for example deterrence, would be located there.
Obviously, the DOD guys are involved. But I mean, I think we have to think about
this in a 21st century model, and I don't think we are doing that completely.

CHAPPELL I think, I suspect we have to wrap up. I'm getting the greasy eye from Sean, so I'm
LAWSON: surrendering my role as moderator and the microphone to him.

SEAN ATKINS: Excellent. Yep, we are definitely over time now. So I think, it's probably just
appropriate to say thank you to all the panelists. Tony, Larry, Mark, and Joel-- thank

you for taking the time to have this great discussion and sharing your thoughts.

Thank you, too, to the Center for International Studies team for putting this
together-- Michele and Lauren, thanks-- as well as the Internet Policy Research
Initiative, Cybersecurity at MIT-Sloan, and then Cyber Politics at MIT for supporting.
And then finally, thanks to everybody that has joined the conversation. Greatly
appreciate it, and I hope it was useful for you.
