Knowledge Management in The Intelligent Enterprise
Artech House
Boston London
www.artechhouse.com
All rights reserved. Printed and bound in the United States of America. No part of this book
may be reproduced or utilized in any form or by any means, electronic or mechanical, including
photocopying, recording, or by any information storage and retrieval system, without permission
in writing from the publisher.
All terms mentioned in this book that are known to be trademarks or service marks have been
appropriately capitalized. Artech House cannot attest to the accuracy of this information. Use of
a term in this book should not be regarded as affecting the validity of any trademark or service
mark.
International Standard Book Number: 1-58053-494-5
Library of Congress Catalog Card Number: 2003041892
Preface
In this rapidly changing and volatile world, the expectations required of those in the intelligence discipline are high: knowledge of the hidden and foreknowledge of the unpredictable. The consumers of intelligence (national policymakers, military planners, warfighters, law enforcement, and business leaders) all expect accurate and timely information about their areas of interest and threats to their security. They want strategic analyses, indications and warnings, and tactical details. This book is about the application of knowledge management (KM) principles to the practice of intelligence to fulfill those consumers' expectations.
I began this manuscript shortly before the September 11, 2001, terrorist attack on the United States. Throughout the period that I was writing this manuscript, the nation was exposed to an unprecedented review of the U.S. intelligence organizations and processes; intelligence has entered our nation's everyday vocabulary. Unfortunately, too many have reduced intelligence to a simple metaphor of "connecting the dots." This process, it seems, appears all too simple after the fact: once you have seen the picture, you can ignore irrelevant, contradictory, and missing dots. Real-world intelligence is not a puzzle of connecting dots; it is the hard daily work of planning operations, focusing the collection of data, and then processing the collected data for deep analysis to produce a flow of knowledge for dissemination to a wide range of consumers. From a torrent of data, real-world intelligence produces a steady stream of reliable and actionable knowledge. Intelligence organizations have performed and refined this process to deliver knowledge long before the term knowledge management became popular; today they are applying new collaborative methods and technologies to hone their tradecraft. This book focuses on those methods and technologies.
1
Knowledge Management and Intelligence
This is a book about the management of knowledge to produce and deliver a special kind of knowledge: intelligence, that knowledge that is deemed most critical for decision making both in the nation-state and in business. In each case, intelligence is required to develop policy and strategy and for implementation in operations and tactics. The users of intelligence range from those who make broad policy decisions to those who make day-to-day operational decisions. Thus, the breadth of this product we call intelligence is as wide as the enterprise it serves, with users ranging from executive decision makers to every individual in the enterprise, including its partners, suppliers, and customers.
First, we must define the key terms of this text that refer to the application
of technology, operations, and people to the creation of knowledge:
Knowledge management refers to the organizational disciplines,

understanding that is covered by complexity, deliberate denial, or outright deception. The intelligence process has been described as the process of the discovery of secrets by secret means. In business and in national security, secrecy is a process of protection for one party; discovery of the secret is the object of competition or security for the competitor or adversary. The need for security in the presence of competition, crisis, and conflict drives the need for intelligence. While a range of definitions of intelligence exist, perhaps the most succinct is that offered by the U.S. Central Intelligence Agency (CIA): "Reduced to its simplest terms, intelligence is knowledge and foreknowledge of the world around us: the prelude to decision and action by U.S. policymakers" [1]. These classical components of intelligence, knowledge and foreknowledge, provide the insight and warning that leaders need for decision making to provide security for the business or nation-state [2].

The intelligence enterprise encompasses the integrated entity of people,

nation-state collaboration (in both cooperative trade and coalition warfare), economics, and communication has increased the breadth of intelligence analysis to include a wide range of influences related to security and stability. While intelligence has traditionally focused on relatively narrow collection of data by trusted sources, a floodgate of open sources of data has opened, making available information on virtually any topic. However, these new avenues come with the attendant uncertainty in sources, methods, and reliability.

Depth of knowledge to be understood. Driven by the complexity of operations, policymaking, military warfare, and business operations is ever increasing, placing demands for the immediate availability of intelligence about the dynamic world or marketplace to make nation-state policy and business strategy decisions.
Throughout this book, we distinguish between three levels of abstraction
of knowledge, each of which may be referred to as intelligence in forms that
range from unprocessed reporting to finished intelligence products [3]:
1. Data. Individual observations, measurements, and primitive messages
form the lowest level. Human communication, text messages, electronic queries, or scientific instruments that sense phenomena are the
major sources of data. The terms raw intelligence and evidence (data
that is determined to be relevant) are frequently used to refer to elements of data.
2. Information. Organized sets of data are referred to as information. The
organization process may include sorting, classifying, or indexing and
linking data to place data elements in relational context for subsequent
searching and analysis.
3. Knowledge. Information once analyzed, understood, and explained is
knowledge or foreknowledge (predictions or forecasts). In the context
of this book, this level of understanding is referred to as the intelligence product. Understanding of information provides a degree of
comprehension of both the static and dynamic relationships of the
objects of data and the ability to model structure and past (and
future) behavior of those objects. Knowledge includes both static content and dynamic processes.
These abstractions are often organized in a cognitive hierarchy, which includes a level above knowledge: human wisdom. In this text, we consider wisdom to be a uniquely human cognitive capability: the ability to correctly apply knowledge to achieve an objective. This book describes the use of IT to support the creation of knowledge but considers wisdom to be a human capacity out of the realm of automation and computation. IT can enable humans to gain experience through training, simulation, and enhanced understanding of real-life events; in this way, technology can contribute to a human's growth in wisdom [4].
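As an informal illustration of these three levels of abstraction, a minimal Python sketch follows; the class names, fields, and the simple organize and analyze functions are hypothetical illustrations, not part of any intelligence system described in this book.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Data:
    """Lowest level: an individual observation, measurement, or message."""
    source: str   # e.g., a sensor, a report, a query result (illustrative only)
    content: str  # the raw, unprocessed observation

@dataclass
class Information:
    """Organized sets of data: sorted, indexed, and linked into context."""
    index: Dict[str, List[Data]] = field(default_factory=dict)

    def add(self, topic: str, item: Data) -> None:
        # Indexing places each data element in a relational context
        self.index.setdefault(topic, []).append(item)

@dataclass
class Knowledge:
    """Information that has been analyzed, understood, and explained."""
    topic: str
    explanation: str  # static content: what the organized data means
    forecast: str     # foreknowledge: a predicted or forecast behavior

def organize(raw: List[Data], topic: str) -> Information:
    """Organize raw data (sort, index, link) into information."""
    info = Information()
    for item in raw:
        info.add(topic, item)
    return info

def analyze(info: Information, topic: str) -> Knowledge:
    """Analyze information to produce knowledge (here, a trivial summary)."""
    count = len(info.index.get(topic, []))
    return Knowledge(
        topic=topic,
        explanation=f"{count} observations organized under '{topic}'",
        forecast="further reporting expected",
    )

# Example: raw observations -> information -> knowledge
raw = [Data("open source", "shipment observed"), Data("HUMINT", "meeting reported")]
print(analyze(organize(raw, "logistics"), "logistics"))
```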
Sun Tzu's treatise also defined five categories of spies [6], their tasks, and
the objects of their intelligence collection and covert operations. More than
seven centuries before Sun Tzu, the Hebrew leader Moses commissioned and
documented an intelligence operation to explore the foreign land of Canaan.
That classic account clearly describes the phases of the intelligence cycle, which
proceeds from definition of the requirement for knowledge through planning,
tasking, collection, and analysis to the dissemination of that knowledge. He first
detailed the intelligence requirements by describing the eight essential elements
of information to be collected, and he described the plan to covertly enter and
reconnoiter the denied area:
When Moses sent [12 intelligence officers] to explore Canaan, he said, "Go up through the Negev and on into the hill country. See what the land is like and whether the people who live there are strong or weak, few or many.
What kind of land do they live in?
Is it good or bad?
What kind of towns do they live in?
Are they unwalled or fortified?
How is the soil?
Is it fertile or poor?
Are there trees on it or not?
Do your best to bring back some of the fruit of the land." [It was the season for the first ripe grapes.]
(Numbers 13:17-20, NIV) [7].
A 12-man reconnaissance team was tasked, and it carried out a 40-day collection mission studying (and no doubt mapping) the land and collecting crop samples. The team traveled nearly 200 miles north from the desert of Zin (modern Gaza) observing fortified cities and natural resources. Upon return, the intelligence observations were delivered and the data analysis and report synthesis phase began as the leaders considered the implications of the data (Numbers 13:26-33). As all too often is the case in intelligence, the interpretation of the data and judgments about the implications for the Hebrew people were in severe dispute. In the account of this analysis, the dispute over the interpretation of the data and the estimated results, once disseminated to the leaders and the nation at large, led to a major national crisis (see Numbers 14 and 15).
The analysis of intelligence data has always been as significant as the collection, because analysis of the data and synthesis of a report create meaning from the often-scant samples of data about the subject of interest. Before becoming the first U.S. president, George Washington commissioned intelligence-collection operations when he was a general officer of the revolutionary army. He recognized the crucial importance of analysis. In a letter of appreciation to James Lovell in April 1782, Washington specifically noted the value of all-source intelligence analysis and synthesis that integrates disparate components of evidence:

I THANK YOU FOR THE TROUBLE you have taken in forwarding the intelligence which was inclosed in your Letter of the 11th of March. It is by comparing a variety of information, we are frequently enabled to investigate facts, which were so intricate or hidden, that no single clue could have led to the knowledge of them. In this point of view, intelligence becomes interesting which but from its connection and collateral circumstances, would not be important [8].
radically shifted at the transition between waves. Second, each distinct wave is characterized by its means of wealth production and by a central resource at the core of the production mechanism. Third, technology is the cause of the rapid transitions, because as new technologies are introduced, the entire basis for wealth (production) and power (the potential for economic strength and destruction) changes. These changes between waves also bring the potential to rapidly change the world order. Finally, each new wave has partitioned the nation-states of the world into categories, each characterized by its maturity (e.g., an information-age society is characterized as third wave). The world is now trisected into nations in each of the three wave categories.
Table 1.1 summarizes the three waves identified by the Tofflers, transitioning from the agricultural to the information age. The agricultural wave was
characterized by peasant-based crop production, dependent upon the central
resource of land ownership. The industrial age rapidly shifted the balance of
world power, as raw materials for mass production became the central resource.
Mass production, and the comparable ability to wage mass destruction, transferred power to the nation-states with industrial technology.
The last decades of the twentieth century brought the transition to a new
information age, in which the Tofflers asserted:
Information (the raw material of knowledge) is the central resource for
and economies.
The intelligence discipline has always faced a competition for information: critical information about competitors and adversaries. Table 1.1 also distinguishes the significant transitions in the focus of intelligence throughout the Tofflers' waves of civilization. Throughout the agricultural age, intelligence collection remained centered on human observation and interaction, or human intelligence (HUMINT), as cited earlier in the accounts of Moses, Sun Tzu, and General Washington. This human collection-centric means was dependent upon physical human access and covert means to communicate information from collectors to decision makers.
The industrial age introduced increasingly complex remote sensing instruments and stand-off collection platform technologies, ranging from early telescopes and hot air balloons to post-World War II radars and more recent satellite platforms. These sensors and platforms combined to provide revolutionary, powerful intelligence-collection capabilities. Intelligence consumers increased their dependence on these sources to complement and validate their traditional HUMINT sources.

(Table 1.1 contrasts the agricultural wave, until 1700; the industrial wave, 1700-2000; and the information wave, 2000 and beyond, in terms of wealth creation (power and business), the object of nation-state conflict (land in the agricultural wave), the nature of warfare, conflict, and competition (infantry warfare and the attrition of infantry, targeting human bodies, in the agricultural wave), the focus of intelligence, and representative intelligence examples. In the information wave the focus becomes network centric (network access) and then knowledge centric (perceptual access), with a future emphasis on human cognition, decision making, and influence.)

Aggressors' orders of battle were essentially
hidden until radar, electro-optics, and radio receivers were refined throughout
the Cold War to provide remote sensing of large weapons and production facilities, both for monitoring treaties and providing indications and warnings of
large-scale attacks. Revolutionary space capabilities introduced by electronic
sensors and spaceborne platforms in the 1960s and 1970s moved intelligence
toward a mature sensor-centric emphasis. In the Gulf War, these sensor assets
benefited the United States-led coalition on the battlefield, providing unprecedented surveillance and targeting. In that sensor-centric world of the early
1990s, information superiority required sensing coverage and the key technologies were global sensors.
But the Gulf War also pointed out a weakness in the ability to reap the
benefits of global sensing: the difficulties in developing collaboration between intelligence and operational communities and the inability to rapidly disseminate knowledge to the warfighter [10]. Since the war, as remote sensing and global communications have proliferated and become available to all, the information competition has shifted from coverage to speed of access and dissemination. The U.S. defense community has developed a network-centric approach to
intelligence and warfare that utilizes the power of networked information to
enhance the speed of command and the efficiency of operations [11]. Sensors
are linked to shooters, commanders efficiently coordinate agile forces, and
engagements are based on prediction and preemption. The keys to achieving
information superiority in this network-centric model are network breadth (or
connectivity) and bandwidth; the key technology is information networking.
Winning future intelligence competitions, where the conflict space is
global and extends across the physical, symbolic, and cognitive realms, will
require yet a new strategy. The future emphasis will become dependent on
maintaining a knowledge-centric advantage. This is because we are moving into a
world environment where no single player will maintain the sole and significant
margin in global sources of information or in the ability to network information. Global sensing and networking capabilities will become a commodity with
most global competitors at parity. Like the open chess game where everyone sees
all the pieces, the world will be an open chessboard of readily available information accessible by all intelligence competitors. The ability to win will depend
upon the ability to select and convert raw data into accurate decision-making
knowledge. Intelligence superiority will be defined by the ability to make decisions most quickly and effectively, with the same information available to virtually all parties. The key enabling technology in the next century will become processing and cognitive power to rapidly and accurately convert data into comprehensive explanations of reality, sufficient to make rapid and complex
decisions.
Consider several of the key premises about the significance of knowledge
in this information age that are bringing the importance of intelligence to the
forefront. First, knowledge has become the central resource for competitive
advantage, displacing raw materials, natural resources, capital, and labor. This
resource is central to both wealth creation and warfare waging. Second, the
management of this abstract resource is quite complex; it is more difficult (than
material resources) to value and audit, more difficult to create and exchange,
and much more difficult to protect. Third, the processes for producing knowledge from raw data are as diverse as the manufacturing processes for physical
materials, yet are implemented in the same virtual manufacturing plant: the

(Figure: the age of revolution. Nonlinear change drives a progression from organizational learning and continuous improvement in the 1970s, through continuous change and the innovation of products, services, and business processes in the 1980s and 1990s, to nonlinear innovation and knowledge management in the twenty-first century, in which knowledge becomes a commodity; those at risk are incumbents and superpowers.)
In this view, those at greatest risk in this new nonlinear environment are
incumbents (in business) and superpowers (in national security). The U.S.
emphasis on RMA to become innovative and agile is observed in the investments to address asymmetric threats and information warfare. And the exploration of a new network-centric doctrine illustrates the move to restructure the
military to an adaptive warfighting organism that emphasizes networked collaborative knowledge rather than a command hierarchy that emphasizes control of
weaponry [14].
A functional taxonomy (Figure 1.2) based on the type of analysis and the
temporal distinction of knowledge and foreknowledge (warning, prediction,
and forecast) distinguishes two primary categories of analysis and five subcategories of intelligence products [16]:
Descriptive analyses provide little or no evaluation or interpretation; they record and statistically summarize the underlying data.

Inferential analyses apply evaluation and interpretation to the data (evidence) to infer and synthesize explanations that describe the meaning of the underlying data. We can distinguish four different focuses of inferential analysis:

1. Analyses that explain past events (How did this happen? Who did it?);
(Figure 1.2 Intelligence products. Descriptive analysis provides recording and statistical summaries with no evaluation. Inferential analysis applies evaluation and inference to draw conclusions and is subdivided into analyses that explain past events (investigation), describe structure or attributes (e.g., command structure), describe behavior or states (tracking, processes), and predict future events (forecasts, foreknowledge).)
(Table: intelligence users and their concerns, for the nation-state and for business. Security concerns: for the nation-state, sovereignty, global stability, treaties and alliances, and threats to national interests; for business, competitiveness, growth, real and intellectual property, business alliances, and threats to market position. Strategic intelligence: political, economic, and military analysis and defined-threat analysis for the nation-state; market analysis and competitor analysis for business. Indications and warning (I&W): threat event warning for the nation-state; market discontinuities and trends for business. Operational and tactical intelligence: diplomatic support, crisis support, and military targeting for the nation-state; marketing and sales support, supply-chain management, and customer relations management for business. Intelligence consumers: national leaders, the military, and the public for the nation-state; leaders and management, operations, and employees for business.)
(Figure: the objects, or targets, of intelligence for the nation-state and for business. For the nation-state: own military force dispositions, called friendly force information (FFI); the neutral environment (global affairs, foreign relations, treaties, economics, politics, transnationals), the subject of national intelligence; and threats to security (adversary nations, transnational and NGO threats, battlespace factors and constraints such as terrain, weather, and lines of communication, adversary military threats, and infrastructure threats), the subject of military intelligence. For business: business operations (customer relations management, sales force automation, supply chain management, e-commerce), the subject of business intelligence; the market (customers, products and services), market factors (economy, season, sales area), and market dynamics; and the competitive landscape (competitor market position, competitor operations and products), the subject of competitor intelligence.)
that each of the four disciplines, as defined by their users, partition the target
subjects in ways that are not analogous:
1. National intelligence focuses on the understanding of the global environment (political, economic, natural environmental, science, and
technology areas) and its important participants (foreign nation-states
and their political organizations, nongovernmental organizations
[NGOs], and influential individuals).
2. Military intelligence (MI) refers to the intelligence processes that focus
on understanding foreign military threats to provide threat assessments,
I&W, weapons targeting, and damage assessments (in time of conflict).
3. Business intelligence (BI) refers to the acquisition, organization,
analysis, and reporting of internal and external factors to enable
National intelligence refers to the strategic knowledge obtained for the leadership of nation-states to maintain national security. National intelligence is focused on national security: providing strategic warning of imminent threats, knowledge on the broad spectrum of threats to national interests, and foreknowledge regarding future threats that may emerge as technologies, economies, and the global environment change. National intelligence also supports national leaders in such areas as foreign policymaking, assessments of global economies, and validation of treaty compliance by foreign countries.
The term intelligence refers to both a process and its product. The U.S.
Department of Defense (DoD) provides the following product definitions that
are rich in description of the processes involved in producing the product [22]:
1. The product resulting from the collection, processing, integration,
analysis, evaluation, and interpretation of available information concerning foreign countries or areas;
(Table: the focus of business and competitive intelligence. Business intelligence focuses internally on business operations; its objects (targets) include the supply chain, customer relations management, and buyers and suppliers, and its objective uses by intelligence consumers include efficiency and business process performance analysis, refinement, and reengineering. Competitive intelligence focuses externally. Neutral external factors include customer structure, preferences, and behaviors, the financial environment, the regulatory climate, the marketplace environment, segmentation, market drivers, and buying patterns; objective uses include market dynamics modeling and forecasting, market positioning, learning customer trends, and identifying threats from technology and regulation. Competitive external factors include competitors and strategic partner candidates; objective uses include security, identifying competitor threats, tracking and forecasting competitor actions, and identifying and qualifying strategic partner candidates.)
set of rules (i.e., standards, protocols, interfaces, and services) governing the arrangement, interaction, and interdependence of the elements of the system. These three views of the enterprise (Figure 1.4) describe three layers of people-oriented operations, system structure, and procedures (protocols) that must be defined in order to implement an intelligence enterprise.
The operational layer is the highest (most abstract) description of the concept of operations (CONOPS), human collaboration, and disciplines of the
knowledge organization. The technical architecture layer describes the most
detailed perspective, noting specific technical components and their operations,
protocols, and technologies. In the middle is the system architecture layer,
which defines the network structure of nodes and interconnections. The performance of these layers is quantified by the typical kinds of metrics depicted in
the figure. The intelligence supply chain that describes the flow of data into
knowledge to create consumer value is measured by the value it provides to
intelligence consumers. Measures of human intellectual capital and organizational knowledge describe the intrinsic value of the organization. The distributed computing architecture is measured by a variety of performance-level
metrics that characterize the system capability in terms of information volume,
capacity, and delivery rates. The technical physical (or hardware) and abstract
(Figure 1.4 Architecture views, enterprise components, and measures. The operational architecture describes the knowledge-based organization and its operations (collection, analysis, and dissemination; goals, measures, workflows, and processes), measured by intelligence supply chain value added, consumer operational utility and effectiveness, financial measures, and virtual team growth. The system architecture describes the configuration: the network infrastructure of process nodes and links and the data, communication, processing, and application objects, with measures of system performance such as transfer rates, storage capacity, and number of users. The technical architecture describes the component technologies (storage, communication, network computing, processing, and information and KM technologies, along with standards, data models, object and agent protocols, and network protocols), with technical parameters such as bandwidth, storage density, and processing performance (speed and capacity).)
(or software) elements of the enterprise are described by engineering dimensional performance parameters (e.g., bandwidth, storage density, and processing
gain).
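As a purely illustrative rendering of the three architecture views and their measures, the sketch below captures them as a simple data structure; the class, the field names, and the example metrics are assumptions for illustration, not a prescribed schema.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class ArchitectureView:
    """One of the three enterprise architecture layers and its measures."""
    name: str
    describes: str
    measures: List[str]

# A hypothetical rendering of the three views described in the text
enterprise_views: Dict[str, ArchitectureView] = {
    "operational": ArchitectureView(
        name="Operational architecture",
        describes="CONOPS, human collaboration, knowledge-based organization",
        measures=["supply chain value added", "consumer utility", "team growth"],
    ),
    "system": ArchitectureView(
        name="System architecture",
        describes="network structure of process nodes and interconnections",
        measures=["transfer rates", "storage capacity", "number of users"],
    ),
    "technical": ArchitectureView(
        name="Technical architecture",
        describes="components, protocols, standards, and technologies",
        measures=["bandwidth", "storage density", "processing gain"],
    ),
}

# Example: list the measures tracked at each layer
for view in enterprise_views.values():
    print(f"{view.name}: {', '.join(view.measures)}")
```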
Throughout this book, we introduce the KM principles and practice that
allow intelligence officers, enterprise architects, and engineers to implement
these abstract models into a working intelligence enterprise of people and their
processes, systems, and technology.
1.5 The State of the Art and the State of the Intelligence Tradecraft
The subject of intelligence analysis remained largely classified through the
1980s, but the 1990s brought the end of the Cold War and, thus, open publication of the fundamental operations of intelligence and the analytic methods
employed by businesses and nation-states. In that same period, the rise of commercial information sources and systems produced the new disciplines of open
source intelligence (OSINT) and business/competitor intelligence. In each of
these areas, a wealth of resources is available for tracking the rapidly changing
technology state of the art as well as the state of the intelligence tradecraft.
1.5.1

of Intelligence and the Sherman Kent School of Intelligence, unclassified versions are published on the school's Web site (http://odci.gov.csi), along with periodically issued monographs on technical topics related to intelligence analysis and tradecraft.
International Journal of Intelligence and Counterintelligence. This quarterly journal covers the breadth of intelligence interests within law enforcement, business, nation-state policymaking, and foreign affairs.

Intelligence and National Security. A quarterly international journal published by Frank Cass & Co. Ltd., London, this journal covers broad intelligence topics ranging from policy, operations, users, analysis, and products to historical accounts and analyses.

Defense Intelligence Journal. This is a quarterly journal published by

the U.S. Army Intelligence Center (Ft. Huachuca) that is available online and provides information to military intelligence officers on studies of past events, operations, processes, military systems, and emerging research and development.

Jane's Intelligence Review. This monthly magazine provides open
Several sources focus on the specific areas of business and competitive intelligence with attention to the management, ethical, and technical aspects of collection, analysis, and valuation of products.

Competitive Intelligence Magazine. This is a CI source for general

by John Wiley with the SCIP, contains best-practice case studies as well as technical and research articles.

Management International Review. This is a quarterly refereed journal
1.5.3 KM

officers and staff includes articles on KM, best practices, and related leadership topics.

Harvard Business Review, Sloan Management Review. These management journals target executives responsible for leading performance improvement and contributing thought leadership in business. Emphasis areas include KM, organizational learning, core competencies, and process management.

American Productivity and Quality Center (APQC). The APQC is a

journal provides technical articles on the theory, techniques, and practice of knowledge extraction from large databases.

International Journal on Multi-Sensor, Multi-Source Information
(Figure: organization of the book. Part I covers intelligence and its applications; Part II covers knowledge management processes (socialization, internalization, externalization, and combination); Part III covers the intelligence enterprise. Chapter 6 addresses implementing analysis and synthesis; Chapter 7, knowledge internalization and externalization; Chapter 8, explicit knowledge capture and combination; Chapter 9, the intelligence enterprise; and Chapter 10, knowledge management technology.)
Endnotes
[1]
[2]
The United States distinguishes intelligence proper as the service of obtaining and delivering knowledge to government users (consumers); counterintelligence and covert action are
intelligence-related operations. In this book, we do not discuss these secondary intelligencerelated activities.
[3]
[4]
The Greeks distinguished wisdom (sophia) and understanding (sunesis) as the principles by
which we live and the ability to apply those principles in daily life, respectively.
[5]
Tzu, Sun, The Art of War, translated by R. D. Sawyer, Boulder, CO: Westview Press,
1994, p. 231.
[6]
Tzu, Sun, The Art of War, translated by R. D. Sawyer, Boulder, CO: Westview Press, 1994, pp. 231-232. Sun Tzu's five categories can be compared to current HUMINT terminology: 1) local spies (agents native to the foreign country), 2) inward spies (foreign agents who are officials), 3) converted spies (double agents: foreign agents turned to one's use), 4) doomed spies (one's own expendable agents sent with fabricated intelligence for purposes of deception), and 5) surviving spies (defectors or those returning with intelligence).
[7]
Relevant Information is comprised of intelligence (information about the operational environment, adversaries, and third parties), friendly force information (information about own
forces), and essential elements of friendly information (specific information about friendly
forces we seek to deny to an adversary). See Field Manual 3-0, Operations, Washington,
D.C.: HQ Dept. of U.S. Army, June 2001, Chapter 11: Information Superiority,
accessed on-line at http://www.adtdl.army.mil/cgi-bin/atdl.dll/fm/3-0/toc.htm. The enumeration of intelligence requirements effectively defined the instructions to perform the
process defined in U.S. Army doctrine as Intelligence Preparation of the Battlefield. See
Field Manual 34-130, Intelligence Preparation of the Battlefield, Washington, D.C.: HQ
Dept. of U.S. Army, July 1994, accessed on-line at http://www.adtdl.army.mil/
cgi-bin/atdl.dll/fm/34-130/toc.htm.
[8] From Presidential Reflections on US Intelligence, CIA Center for the Study of Intelligence, accessed on-line November 2001 at http://www.odci.gov/csi/monograph/firstln/
washington.html.
[9] These concepts are described in, for example: Toffler, A., The Third Wave, New York: Bantam, 1991; Toffler, A., and H. Toffler, War and Anti-War, New York: Warner, 1995; and Toffler, A., Powershift: Knowledge, Wealth and Violence at the Edge of the 21st Century, New York: Bantam, 1991.
[10] Keaney, T. A., and E. Cohen, Gulf War Air Power Survey Summary Report, Washington D.C.: Government Printing Office, 1993, Chapter 4: What Was the Role of
Intelligence?
[11] Cebrowski, A. K. (VADM, USN), and J. J. Garstka, Network-Centric Warfare: Its Origin
and Future, Naval Institute Proceedings, January 1998, accessed on-line November 2001 at
http://www.usni.org/Proceedings/Articles98/PROcebrowski.htm. See also Alberts, D. S., et
al., Network Centric Warfare: Developing and Leveraging Information Superiority, (2nd ed.),
Washington, D.C.: C4ISR Cooperative Research Program, August 1999.
[12] Hamel, G., Leading the Revolution, Boston: HBS Press, 2000.
[13] Hamel, G., Leading the Revolution, Boston: HBS Press, 2000, p. 18.
[14] Edwards, S. J. A., Swarming on the Battlefield: Past, Present and Future, Santa Monica, CA:
RAND, 2000.
[15] Consumer's Guide to Intelligence, Washington, D.C.: CIA, September 1993, updated February 1994.
[16] This taxonomy is based on a categorization in the text: Schum, David, Inference and
Analysis for the Intelligence Analyst, Volumes 1 and 2, Washington D.C.: University Press
of America, 1987.
[17] While national and business intelligence is required to understand and estimate continuous processes in threat and market environments, intelligence analysis in both domains
must also consider discontinuitiessurprises or unexpected emergent behavior in complex
processes. Discontinuities arising from new technologies, cultural shifts, globalization, and
other factors can create radical changes in the threats to nation-states as well as to business.
[18] Definition from Glossary of Competitive Intelligence Terms, Competitive Intelligence
Review, Vol. 9, No. 2, April-June 1998, p. 66.
[19] See, for example, the argument posed by Stanley Kober in "Why Spy? The Uses and Misuses of Intelligence," Cato Policy Analysis No. 265, CATO Institute, December 12,
1996, accessed on-line at http://www.cato.org/pubs/pas/pa-265.html.
[20] Development of Surveillance Technology and Risk of Abuse of Economic Information:
An Appraisal of Technologies of Political Control, Working document for the Scientific
and Technological Options Assessment Panel, PE 168.184/Int.St./part 1 of 4, European
Parliament, Luxembourg, May 1999.
[21] For a discussion of the disagreements over and implications of using national intelligence organizations in support of private-sector economic intelligence, see Greenberg, M. R., and R. N. Haas, Making Intelligence Smarter: The Future of U.S. Intelligence, New York: Council on Foreign Relations, 1996, section entitled "Economic Intelligence," and
[22] Joint Publication 1-02, DoD Dictionary of Military and Associated Terms.
[23] Herman, M., Intelligence Power in Peace and War, Cambridge, England: Cambridge University Press, 1996, p. 56.
[24] Libicki, Martin C., "Information Dominance," Strategic Forum, Number 132, Institute for Strategic Studies, National Defense University, Washington D.C., November 1997.
[25] Joint Vision 2020 (JV 2020), U.S. Joint Chiefs of Staff, Department of Defense, 2000.
[26] Robertson, E., Department of the Navy Acquisition Reform FY2000, April 19, 2000,
accessed on-line at http://www.ace.navy.mil/alrweek2000/briefs/jackson/sld001.htm; this
document cites the Computer Sciences Corporation (CSC) corporate knowledge
environment.
[27] DaimlerChrysler Extended Enterprise definition, accessed at http://supplier.chrysler.com/purchasing/extent/index.html.
[28] An architecture is defined in IEEE 610.12 as the structure of components, their relationships, and the principles and guidelines governing their design and evolution over time.
[29] C4ISR Architecture Framework Version 2.0, Office of the Assistant Secretary of Defense for
Command, Control, Communications, and Intelligence, Washington, D.C., November
1997.
Selected Bibliography
The following bibliography identifies major texts in the application areas of
national and military intelligence and business and competitive intelligence.
Consumers Guide to Intelligence, CIA, Washington, D.C., September 1993, updated February 1994.
National and Military Intelligence (texts on intelligence that emphasize the processing, analytic, and decision-making support roles of intelligence, rather than collection or covert action).
Berkowitz, B. D., and A. E. Goodman, Best Truth: Intelligence in the Information Age, New
Haven, CT: Yale University Press, 2000.
Berkowitz, B. D., and A. E. Goodman, Strategic Intelligence for American National Security, Princeton, NJ: Princeton University Press, 1989.
Bozeman, A. B., Strategic Intelligence and Statecraft: Selected Essays, Washington, D.C.:
Brassey's, 1992.
Clark, R., Intelligence Analysis, Baltimore: American Lit Press, 1996.
Codevilla, A., Informing Statecraft: Intelligence for a New Century, NY: The Free Press, 1992.
Dulles, A., The Craft of Intelligence, New York: Harper & Row, 1963; London: Weidenfeld & Nicolson, 1963; Boulder, CO: Westview, 1985.
Herman, M., Intelligence Power in Peace and War, Cambridge, England: Cambridge University Press, 1996.
Heuer, R., The Psychology of Intelligence Analysis, Washington D.C.: CIA Sherman Kent
School of Intelligence Analysis, 1999.
Johnson, L., Bombs, Bugs, Drugs, and Thugs: Intelligence and America's Quest for Security,
New York: New York University Press, 2000.
Krizan, L., Intelligence Essentials for Everyone, Occasional Paper No. 6., Joint Military
Intelligence College, Washington, D.C.: Government Printing Office, 1999.
Steele, R. D., On Intelligence: Spies and Secrecy in an Open World, Washington D.C.:
AFCEA International Press, 2000.
Treverton, G. D., Reshaping National Intelligence for an Age of Information, Cambridge,
England: Cambridge University Press, 2001.
Competitor Intelligence
Gilad, B., and T. Gilad, The Business Intelligence SystemA New Tool for Competitive
Advantage, New York: American Management Association, 1988.
Fuld, L. M., The New Competitor Intelligence, New York: John Wiley & Sons, 1995, and
Competitor Intelligence: How to Get It; How to Use It, New York: John Wiley, 1985.
Hohhof, B., Competitive Information Systems Development, Glastonbury, CT: The Futures
Group, 1993.
Porter, M. E., Competitive Strategy: Techniques for Analyzing Industries and Competitors,
New York: The Free Press, 1980.
Stoffels, J. D., Strategic Issues Management: A Comprehensive Guide to Environmental Scanning, New York: Pergamon, 1993.
Tyson, K. W. M., Competitor Intelligence Manual and Guide: Gathering, Analyzing, and
Using Business Intelligence, Englewood Cliffs, NJ: Prentice Hall, 1990.
Business Intelligence
Harvard Business Review on Knowledge Management, Boston: Harvard Business School
Press, 1998.
Dixon, N. M., Common Knowledge: How Companies Thrive by Sharing What They Know,
Boston: Harvard Business School Press, 2000.
Krogh, G. V., Johan Roos, and Dirk Kleine, Knowing in Firms: Understanding, Managing
and Measuring Knowledge, London: Sage, 1998.
Liebowitz, J., Building Organizational Intelligence: A Knowledge Management Primer, Boca
Raton, FL: CRC Press, 1999.
2
The Intelligence Enterprise
Intelligence, the strategic information and knowledge about an adversary and an operational environment obtained through observation, investigation, analysis, or understanding [1], is the product of an enterprise operation that integrates people and processes in an organizational and networked computing environment. The intelligence enterprise exists to produce intelligence goods and services (knowledge and foreknowledge) for decision-making and policy-making customers. This enterprise is a production organization whose prominent infrastructure is an information supply chain. As in any business, it has a "front office" to manage its relations with customers, with the information supply chain in the "back office." The intellectual capital of this enterprise includes sources, methods, workforce competencies, and the intelligence goods and services produced. As in virtually no other business, the protection of this capital is paramount, and therefore security is integrated into every aspect of the enterprise.
In this chapter we introduce the stakeholders, functions, and operations of
the intelligence enterprise. We also examine the future of intelligence and how
the global environment is requiring changes in the organization and operations
of intelligence.
Figure 2.1 Structure and metrics of the stakeholders of the U.S. IC. (The figure maps stakeholder roles to the stakeholder structure and to measures of value: the NCA, DCI, Department of State, FBI, Department of Homeland Security, and others as owners and beneficiaries, valuing national security, global awareness, and policy impact; the unified commands (J2) and civilian government users, valuing the intelligence contribution to mission effectiveness; and the intelligence producers, including collectors (NSA signals, CMO measurements and signatures, NIMA images, CIA and DIA human sources, and commercial and open sources), processors, and analysts, valuing accuracy, coverage breadth, data depth, throughput, access and revisit, and timeliness; together these form the intelligence utility serving intelligence users (consumers).)
illustrate the relationships between key stakeholder roles and the metrics by which these stakeholders value the enterprise:

The owners of the process include the U.S. public and its elected officials,

civilian user agencies that measure value in terms of intelligence contribution to the mission of each organization, measured in terms of its impact on mission effectiveness.

Intelligence producers, the most direct users of raw intelligence, include the collectors (HUMINT and technical), processor agencies, and analysts. The principal value metrics of these users are performance based: information accuracy, coverage breadth and depth, confidence, and timeliness.
The purpose and value chains for intelligence (Figure 2.2) are defined by
the stakeholders to provide a foundation for the development of specific value
measures that assess the contribution of business components to the overall
enterprise. The corresponding chains in the U.S. IC include:
Source: the source or basis for defining the purpose of intelligence is found in the U.S. Constitution, derivative laws (i.e., the National Security Act of 1947, Central Intelligence Agency Act of 1949, National Security Agency Act of 1959, Foreign Intelligence Surveillance Act of 1978, and Intelligence Organization Act of 1992), and orders of the executive branch [2]. Derived from this are organizational mission documents, such as the Director of Central Intelligence (DCI) Strategic Intent [3], which documents communitywide purpose and vision, as well as derivative guidance documents prepared by intelligence providers.
Purpose chain: the causal chain of purposes (objectives) for which the
Figure 2.2 Chains of purposes and values for the U.S. IC. (The purpose chain runs from its source in the U.S. Constitution and national security, through the owners, beneficiaries, and consumers guided by the DCI Strategic Intent and the intelligence providers organized as a virtual intelligence enterprise, to objectives such as information superiority (the right information at the right time and place), unifying the community, maximizing resources, and delivering intelligence-superiority products. The corresponding value chain runs from freedom, security, and unity through product superiority and cost to measures including schedule-cost, performance, interoperability, survivability, reliability-availability, operability, scalability, maintainability-testability, affordability, marketability, vulnerability, adaptability, feasibility-producibility, flexibility-customizability, agility, collaboration, allocation, confidentiality, integrity, availability, timeliness (latency), update rate, accuracy, completeness, relevance, responsiveness, coverage, anticipation of information needs, prediction, and push to the right user.)
Value chain: the chain of values (goals) articulated by stakeholders and by which the value of the intelligence enterprise is evaluated.
Three major categories of intelligence products can be distinguished: strategic, military-operational, and military-tactical intelligence. Table 2.1 contrasts
the categories, which are complementary and often share the same sources to
deliver their intelligence products. The primary difference in the categories is
the perspective (long- to short-term projection) and the reporting cycle (annual
to near-real-time updates).
Table 2.1
Major Categories of Nation-State Intelligence

Strategic or National Intelligence (national policymakers). Focus: understanding of current and future status and behavior of foreign nations; estimates of the state of global activities; indications and warnings of threats. Objects of analysis: foreign policy; political posture; national stability; socioeconomics; cultural ideologies; science and technology; foreign relationships; military strength and intent. Reporting cycle: long-term analyses (months, years) and frequent status reports (weekly, daily).

Military Operational Intelligence (military commanders). Focus: understanding of military powers, orders of battle, technology maturity, and future potential. Objects of analysis: orders of battle; military doctrine; science and technology; command structure; force strength; force status and intent.

Military Tactical Intelligence (warfighters). Focus: real-time understanding of military units, force structure, and active behavior (current and future) on the battlefield. Objects of analysis: military platforms; military units; force operations; courses of action (past, current, and potential future).
rates the market served and the vision for the business's role in that market.

Purpose chain: the objectives of the business require knowledge about internal operations and the market (BI objectives) as well as competitors (CI).

Value chain: the chain of values (goals) by which achievement of the

set of measures includes vision and strategy, customer, internal, financial, and learning-growth metrics.
Figure 2.3 The intelligence cycle delivers reports in response to specific requests and queries for knowledge needed to make decisions and set policies. (The cycle proceeds from consumer requirements through 1. planning and direction, 2. collection of raw intelligence data, 3. processing into organized information, 4. analysis-synthesis production of finished intelligence, and 5. dissemination of intelligence products to intelligence consumers.)
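A toy sketch of the cycle as a processing pipeline follows; the function names and the trivial string transformations are hypothetical stand-ins for the planning, collection, processing, analysis-synthesis, and dissemination activities described in the text.

```python
from typing import List

def plan_and_direct(requirement: str) -> List[str]:
    """1. Planning and direction: turn a consumer requirement into collection tasks."""
    return [f"collect: {requirement} (source A)", f"collect: {requirement} (source B)"]

def collect(tasks: List[str]) -> List[str]:
    """2. Collection: gather raw intelligence data for each task."""
    return [f"raw data for [{task}]" for task in tasks]

def process(raw: List[str]) -> List[str]:
    """3. Processing: translate, index, and organize raw data into information."""
    return sorted(f"indexed({item})" for item in raw)

def analyze_and_produce(information: List[str]) -> str:
    """4. Analysis-synthesis: combine all-source information into finished intelligence."""
    return "finished intelligence drawn from " + "; ".join(information)

def disseminate(product: str) -> str:
    """5. Dissemination: deliver the product to the consumer."""
    return f"REPORT: {product}"

# One pass around the cycle for a single (hypothetical) consumer requirement
requirement = "status of port facilities"
report = disseminate(analyze_and_produce(process(collect(plan_and_direct(requirement)))))
print(report)
```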
These sources and methods are among the most fragile [5]and most
highly protectedelements of the process. Sensitive and specially
compartmented collection capabilities that are particularly fragile exist
across all of the collection disciplines.
3. Processing. The collected data is processed (e.g., machine translation,
foreign language translation, or decryption), indexed, and organized in
an information base. Progress on meeting the requirements of the collection plan is monitored and the tasking may be refined on the basis
of received data.
4. All-source analysis-synthesis and production. The organized information
base is processed using estimation and inferential (reasoning) techniques that combine all-source data in an attempt to answer the
requestor's questions. The data is analyzed (broken into components
and studied) and solutions are synthesized (constructed from the accumulating evidence). The topics or subjects (intelligence targets) of
study are modeled, and requests for additional collection and processing may be made to acquire sufficient data and achieve a sufficient
(Table: intelligence source categories by access and means. Open sources, collected by human and technical means, yield open source intelligence (OSINT). Closed sources collected by human means yield human intelligence (HUMINT). Closed sources collected by technical means yield imagery intelligence (IMINT), signals intelligence (SIGINT), and measurements and signatures intelligence (MASINT).)
HUMINT Collection
HUMINT refers to all information obtained directly from human sources [14].
HUMINT sources may be overt or covert (clandestine); the most common categories include:
Clandestine intelligence case officers. These officers are own-country indi-
Technical collection is performed by a variety of electronic (e.g., electromechanical, electro-optical, or bioelectronic) sensors placed on platforms in space,
the atmosphere, on the ground, and at sea to measure physical phenomena
(observables) related to the subjects of interest (intelligence targets). A wide variety of sensor-platform combinations (Table 2.3) collect data that may be used
(Table 2.3: representative sensor-platform combinations for IMINT, SIGINT, and MASINT collection. Space: spaceborne surveillance radar (MTI or target tracking modes); weather satellites; SIGINT ferrets; IR missile warning and tracking; geostationary and polar orbital spacecraft; nuclear detection. Airborne: reconnaissance (recce) platforms; SIGINT standoff and penetrating collectors; UAVs. Ground: ground-based ESM sites and vehicles; combat tactical digital cameras; unattended electronic support measures (ESM) sensors; IR night vision; IR search and track; seismic arrays. Sea and undersea: ship and submarine long-range IR/EO video; shipboard and submarine air and surface surveillance radar; shipboard, submarine, and towed sensors; heliborne dipping and air-dropped sensors; fixed autonomous buoys; underwater acoustic arrays; IR radiometers.)
predictions;

Stealth: the degree of secrecy with which the information is gathered
Figure 2.4 Target process modeling, decomposition, collection planning, and composition (all-source analysis) for a hypothetical surveillance and analysis of a drug operation. (The target process model traces the producer cycle: grow and harvest crops, process and pack, bulk transport, delivery, street distribution, and payment, money laundering, and finance. Each stage produces items of activity and observable phenomena, such as crop signatures, facility and trucking activity, workers, ship and transport traffic, delivery events and parcels, local storage, bank transfers and account traffic, and arrival and handler calls. These observables drive collection tasking of HUMINT A and B, IMINT a and b, SIGINT, and OSINT at the data level; all-source process analysis at the information level; and, at the knowledge level, an intelligence report covering organization, process, capacity, volume, vulnerabilities, and projections, with retasking of collection as required.)
The observable phenomena from each of these events are identified and
assigned to technical (and, in this case, HUMINT) collectors. The collectors
include OSINT (shipping traffic logs), ground-based video surveillance of shipping depots (IMINT a), airborne IMINT observing crop activities and potential processing facilities (IMINT b), and SIGINT analysis of electronic transfers
of funds via court-authorized intercepts.
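A minimal sketch of this kind of decomposition, assuming hypothetical stage names, observables, and collector labels loosely drawn from the example, is shown below.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Stage:
    """One stage of the target process model, its observables, and tasked collectors."""
    name: str
    observables: List[str]
    collectors: List[str]

# Hypothetical decomposition of the drug-operation producer cycle
target_process = [
    Stage("grow/harvest", ["crop signature", "workers"], ["IMINT b", "HUMINT A"]),
    Stage("process/pack", ["facility activity", "trucking"], ["IMINT b"]),
    Stage("transport", ["ship traffic"], ["OSINT shipping logs"]),
    Stage("delivery", ["delivery events", "local storage"], ["IMINT a", "HUMINT B"]),
    Stage("finance", ["bank transfers", "account traffic"], ["SIGINT intercepts"]),
]

def collection_plan(stages: List[Stage]) -> List[str]:
    """Compose one tasking line per (collector, observable) pair for each stage."""
    tasks = []
    for stage in stages:
        for collector in stage.collectors:
            for observable in stage.observables:
                tasks.append(f"{collector}: observe '{observable}' during {stage.name}")
    return tasks

for task in collection_plan(target_process):
    print(task)
```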
The example illustrates the complementary nature of HUMINT and technical sources, whereby two HUMINT sources are required to guide the
Figure 2.5 Intelligence processing and analysis flow includes three distinct phases to develop the production intelligence base. (Incoming data, such as foreign language data, maps, images, and pictures, news, radio, and TV, messages, and books, literature, and papers, is first converted through audio and video signal conversion, image text extraction, foreign language translation, and natural language processing. A preanalysis phase organizes and analyzes the incoming data: database population, data indexing, lexicon and thesaurus extraction, statistical analysis, clustering and linking of related data, threshold alerts, and trend and change analysis build the analysis base. An exploitation (analysis) phase then supports understanding the meaning of the information through interactive search and retrieval; sorting, collating, listing, and clustering; data visualization and analyst perception of visuals and current trends by topic; structured judgment analysis; modeling and simulation; and collaborative analysis, developing the production base.)
tent, topic, or related topics using the lexicon and thesaurus subjects.
Structured judgment analysis tools provide visual methods to link data,
synthesize deductive logic structures, and visualize complex relationships between data sets. These tools enable the analyst to hypothesize,
explore, and discover subtle patterns and relationships in large data volumes: knowledge that can be discerned only when all sources are
viewed in a common context.
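The preanalysis activities shown in Figure 2.5 (indexing, lexicon extraction, and clustering or linking of related items) can be illustrated with a toy example; the tokenizer, the term index, and the linking threshold below are hypothetical simplifications of the tools described here.

```python
from collections import defaultdict
from itertools import combinations
from typing import Dict, List, Set, Tuple

def tokenize(text: str) -> Set[str]:
    """Crude lexicon extraction: lowercased terms of three or more characters."""
    words = (word.strip(".,").lower() for word in text.split())
    return {word for word in words if len(word) >= 3}

def build_index(reports: Dict[str, str]) -> Dict[str, Set[str]]:
    """Data indexing: map each extracted term to the reports that contain it."""
    index: Dict[str, Set[str]] = defaultdict(set)
    for report_id, text in reports.items():
        for term in tokenize(text):
            index[term].add(report_id)
    return index

def link_related(reports: Dict[str, str], min_shared: int = 2) -> List[Tuple[str, str]]:
    """Clustering/linking: pair reports that share at least min_shared terms."""
    terms = {rid: tokenize(text) for rid, text in reports.items()}
    return [
        (a, b)
        for a, b in combinations(sorted(reports), 2)
        if len(terms[a] & terms[b]) >= min_shared
    ]

# Example: three incoming (hypothetical) report snippets
reports = {
    "r1": "Trucks observed at the processing facility near the port",
    "r2": "Port shipping logs show unusual facility traffic",
    "r3": "Bank transfers traced to a shell account",
}
index = build_index(reports)
print(sorted(index["facility"]))  # reports indexed under the term 'facility'
print(link_related(reports))      # report pairs linked by shared terms
```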
Modeling and simulation tools model hypothetical activities, allowing
Focusing Analysis-Synthesis
An independent study [21] of U.S. intelligence recommended a need for intelligence to sharpen the focus of analysis-synthesis resources to deal with the
increased demands by policymakers for knowledge on a wider range of topics,
the growing breadth of secret and open sources, and the availability of commercial open-source analysis. The study offered several recommendations for
analysis-synthesis [22]:
Retain the focus of critical national and military intelligence analytic
resources on the most crucial national security threats and hard targets
One assessment conducted by the U.S. Congress reviewed the role of analysis-synthesis and the changes necessary for the community to reengineer its
processes from a Cold War to a global awareness focus. Emphasizing the crucial
role of analysis, the commission noted:
The raison d'être of the Intelligence Community is to provide accurate and
meaningful information and insights to consumers in a form they can use at
the time they need them. If intelligence fails to do that, it fails altogether.
The expense and effort invested in collecting and processing the information have gone for naught [23].
works, tools, and soft-copy products have permitted less formal interaction and more frequent exchange between consumers and producers.
This allows intelligence producers to better understand consumer needs
and decision criteria. This has enabled the production of more focused,
timely intelligence.
Analytic expertise. Enhancements in analytic training and the increased
infrastructures;
Traditional concerns regarding fragile states in volatile regions, failing
(Table: characteristics of intelligence under the new focus. Centralized intelligence: intelligence management on tactical, operational, and measurable objectives. Distributed intelligence: intelligence management on strategic, anticipatory, and adaptive objectives.)
to rapidly reinvent itself to adapt to emergent threats. The U.S. National Strategy for Homeland Security recognizes the need for changes and has recommended
significant policy, organizational, and infrastructure changes in U.S. intelligence
to respond to terrorist threats. The Strategy asserts, "The United States will take every necessary action to avoid being surprised by another terrorist attack. We must have an intelligence and warning system that can detect terrorist activity before it manifests itself in an attack so that proper preemptive, preventive, and protective action can be taken" [30]. The following chapters introduce the key
KM practices, systems, and technologies that will enable the kind of intelligence
organizational, operational, and infrastructure capacity and agility necessary to
achieve such objectives.
Endnotes
[1] From definition (2) in Joint Pub 1-02.
[2] Executive Order 12333 provides guidelines for the conduct of intelligence activities. The
U.S. Senate Select Committee on Intelligence provides copies of the major laws at its Web
site: http://intelligence.senate.gov/statutes.htm.
[3] The DCI Strategic Intent is a 1998 classified statement of mission and vision for the IC
to provide direction for transformation to a collaborative enterprise with effective application of people, resources, and IT.
[4] The intelligence cycle follows the description of the U.S. CIA; note that the U.S. DoD
Joint Pub 2-0 defines six steps in the cycle by including: 1) planning and direction, 2) collection, 3) processing and exploitation, 4) analysis and production, 5) dissemination and
integration, and 6) evaluation and feedback. See Joint Publication 2-0, Doctrine for Intelligence Support to Joint Operations, March 2000, in particular Chapter 2, The Intelligence
Cycle, accessed on-line on October 30, 2002 at http://www.dtic.mil/doctrine/jel/
new_pubs/jp2_0.pdf.
[5] By fragile, we refer to the potential for loss of value if revealed to the subject of surveillance. Even the most sophisticated sources and methods may often be easily defeated by
denial or deception if revealed.
[6] Shulsky, A. N., Silent Warfare: Understanding the World of Intelligence, second edition, Washington, D.C.: Brassey's, pp. 63-69.
[7] Interview with Dr. Joseph Markowitz in Open Source Quarterly, Vol. 1, No. 2, pp. 8-15.
[8] Herman, M., Intelligence Power in Peace and War, Cambridge, England: Cambridge University Press, 1996, Chapter 4: Collection Sources.
[9] Computer network operations are comprised of an offensive component (computer network attack), a defensive component (computer network defense), and the intelligence
function (computer network exploitation).
[10] U.S. Congressional Commission, Preparing for the 21st Century: An Appraisal of Intelligence, Washington, D.C.: Government Printing Office, March 1, 1996.
[11] Making Intelligence Smarter, The Future of U.S. Intelligence, Independent Task Force of
Council on Foreign Relations, New York, 1996.
[12] Strategic Assessment: 1996, National Defense University, Washington D.C., 1996, Chapter
6: Intelligence.
[13] Fuld, L. M., The New Competitor Intelligence: The Complete Resource for Finding, Analyzing, and Using Information About Your Competitors, New York: John Wiley and Sons,
1994.
[14] Attachment 3: Sources of Intelligence, A.3.1 Human Intelligence (HUMINT), in USAF
Intelligence Targeting Guide, AF Pamphlet 14-210, February 1, 1998.
[15] Ameringer, C. D., U.S. Foreign Intelligence, Lexington, MA: Lexington Books, 1990, pp. 13-14.
[16] Holden-Rhodes, J. F., Sharing the Secrets: Open Source Intelligence and the War on Drugs,
Westport, CT: Praeger, 1997.
[17] Preparing US Intelligence for the Information Age, Director Central Intelligence, STIC
95-003, June 1995.
[18] Part II Analytic Tools To Cope with the Open Source Explosion, in Preparing US Intelligence for the Information Age, Director Central Intelligence, STIC 93-007, December
1993, and Part III, Analytic Tools Recommendations for Open Source Information, in
Preparing US Intelligence for the Information Age, Director Central Intelligence, STIC
95-002, April 1995.
[19] A Review of the Intelligence Community, F-1992-02088 CIA, March 19, 1971, sanitized
and downgraded from top secret for public release to The Princeton Collection, May
1998, p. 3.
[20] Committee Findings and Recommendations, U.S. Congress House Intelligence Committee FY-1996 Markup Report, June 1995.
[21] Hedley, Jr., J. H., Checklist for the Future of Intelligence, Institute for the Study of
Diplomacy, Georgetown University, Washington D.C., 1995. See also IC21: The Intelligence Community in the 21st Century, U.S. House of Representatives, Permanent Select
Committee on Intelligence, March 4, 1996.
[22] Hedley, Jr., J. H., Checklist for the Future of Intelligence, Institute for the Study of
Diplomacy, Georgetown University, Washington D.C., 1995. See the section entitled,
Sharpening the Focus, accessed on-line at http://sfswww.Georgetown.edu/sfs/programs/isd/files/intell.htm.
[23] Improving Intelligence Analysis, in Preparing for the 21st Century: An Appraisal of U.S.
Intelligence, U.S. Congress Commission on the Roles and Capabilities of the U.S. Intelligence Community, Washington, D.C.: Government Printing Office, March 1, 1996.
[24] Martin, F. T., Top Secret Intranet: How U.S. Intelligence Built Intelink, the World's Largest, Most Secure Network, New York: Prentice Hall, 1998.
[25] Global Trends 2015: A Dialogue about the Future with Non-Government Experts, Washington D.C.: National Intelligence Council, December 2000.
[26] Tenet, G. J., The CIA and Security Challenges of the New Century, International Journal of Intelligence and CounterIntelligence, Vol. 13, No. 2, Summer 2000, p. 138.
[27] Conference on Intelligence in the 21st Century, Priverno, Italy, February 14-16, 2001,
accessed on-line at http://future-intel.it/programma.html.
[28] Dumaine, C., Intelligence in the New Millennium, CIA Directorate of Intelligence,
AFCEA Spring Intelligence Conference, April 18, 2001. Table 2.4 is based on this unclassified paper.
[29] For representative viewpoints, see: Medina, C. A., "What to Do When Traditional Models Fail," and Ward, S. R., "Evolution Beats Revolution," in Studies in Intelligence, Vol. 46,
No. 3, Washington D.C.: CIA, 2002 Unclassified Edition, accessed on-line on October 3,
2002 at http://www.cia.gov/csi/studies/vol46no3/index.html.
[30] The White House, The National Strategy for Homeland Security, U.S. Office of Homeland
Security, July 2002, p. viii. See also "Intelligence and Warning," pp. 15-19, for specific
organizational, infrastructure, and policy changes.
3
Knowledge Management Processes
KM is the term adopted by the business community in the mid-1990s to
describe a wide range of strategies, processes, and disciplines that formalize
and integrate an enterprise's approach to organizing and applying its knowledge assets. Some have wondered what is truly new about the concept of managing knowledge. Indeed, many pure knowledge-based organizations
(insurance companies, consultancies, financial management firms, futures
brokers, and of course, intelligence organizations) have long managed
knowledge, and such management processes have been the core competency
of the business.
Several factors distinguish the new strategies that we develop in this
chapter, and each of these has key implications for both public and private
intelligence enterprises. The scope of knowledge required by intelligence
organizations has increased in depth and breadth as commerce has networked
global markets and world threats have diversified from a monolithic Cold War
posture. The global reach of networked information, both open and closed
sources, has produced a deluge of data, requiring computing support to help
human analysts sort, locate, and combine specific data elements to provide
rapid, accurate responses to complex problems. Finally, the formality of the
KM field has grown significantly in the past decade, developing theories for
valuing, auditing, and managing knowledge as an intellectual asset; strategies
for creating, reusing, and leveraging the knowledge asset; processes for conducting collaborative transactions of knowledge among humans and
machines; and network information technologies for enabling and accelerating
these processes.
Table 3.1
Representative Diversity of KM Definitions

A Sampling of KM Definitions
"A conscious strategy of getting the right knowledge to the right people at the right time and helping people share and put information into action in ways that strive to improve organizational performance." O'Dell and Grayson [1]
"...an emerging discipline that stresses a formalized, integrated approach to managing an enterprise's tangible and intangible information assets. ...KM is a coordinated attempt to tap the unrealized potential for sharing and reuse that lies in an enterprise's collective consciousness." The Gartner Group [2]
"The leveraging of intellectual capital to increase the organization's capacity for collective action, which creates business value." Motorola University [2]
"The notion of putting the combined knowledge of the firm at an employee's fingertips is the essence of knowledge management. The basic goal: to take key pieces of data from various sources, such as groupware, databases, applications and people's minds, and make them readily available to users in an organized, logical form that represents knowledge." Sharon Watson [3]
"A systematic process for acquiring, creating, integrating, sharing, and using information, insights, and experiences, to achieve organizational goals." U.S. DoD Rapid Improvement Team for Acquisition KM [2]
"A systematic process for acquiring, creating, integrating, sharing, and using information, insights, and experiences, to make the right business decisions and achieve organizational goals. Objectives to:
Facilitate natural communities of practice;
Develop an architecture for systematic and integrated knowledge sharing both within and across communities of practice;
Convert knowledge into a usable tool for the acquisition professional;
Provide a disciplined and organized methodology for constant improvement and development of knowledge domains;
All with the goal of encouraging innovation and producing successful results." U.S. Marine Corps System Command's Rapid Improvement KM Team [2]
"Create a capability where the acquisition worker can locate acquisition knowledge on demand, from any source, at any time, from any location with a high degree of confidence that information is accurate and relevant." U.S. Navy's Acquisition Reform Office Vision for Acquisition Knowledge Management Systems [2]
to achieve business goals and that knowledge, in the minds of its people, embedded in processes, and in explicit representations in knowledge bases, must be regarded as an intellectual form of capital to be
leveraged. Organizational values must be coupled with the growth of
this capital.
KM involves a process that, like a supply chain, moves from raw materials
The U.S. DoD has recognized the sharp contrast in the industrial and
knowledge age models of national security (Table 3.2) and the change in perspective from emphasizing weapons and sensor platforms in hierarchies to an
emphasis on a knowledge-based warfighting enterprise operating in networks.
The network-centric model recognizes the enterprise comprised of human
(knowledge) resources, which requires shared knowledge creation, sharing, and
viewing [5]. The DoD has further recognized that KM is the critical enabler for
information superiority:
The ability to achieve and sustain information superiority depends, in large
measure, upon the creation and maintenance of reusable knowledge bases;
the ability to attract, train, and retain a highly skilled work force proficient
in utilizing these knowledge bases; and the development of core business
processes designed to capitalize upon these assets [6].
Table 3.2
DoD Contrast in National Security Business Model Perspectives

Industrial Age Model (Platform Centric) vs. Knowledge Age Model (Network Centric):
Producer valued vs. customer valued
Individual focus vs. enterprise focus
Function-based operations vs. process-based operations
Local view vs. global view
Individual responsibility vs. shared responsibility
Span of control vs. span of influence
Scarce resources (industrial age) vs. a systems-thinking approach to increasing productivity and profits (knowledge age)
[Figure: the impact of knowledge creation on operations and decision making. Knowledge creation provides necessary and sufficient data, supports understanding of data-to-situation relationships, provides a basis for and enables dynamic modeling, estimates, and predictions, and accumulates to a more complete understanding of the situation and forecasts of outcomes and consequences. It establishes a framework for decision making, sets the context of interpretation, guides search and acquisition of the most critical data, and enables an agile focus of attention on the most critical issues. The resulting impacts on actions include situation awareness, a shared operating picture, exchange of experience and group learning, organizational agility, more complete decision-making judgment, better selection of goals, effects- and outcome-based decisions, rapid decision response time, comprehensive experience and learning, optimum proactive decisions, and a self-synchronizing, adaptive, and responsive community.]
[Table: five KM strategies with representative knowledge, business intelligence, and military intelligence enhancements. The strategies are: (1) dynamic knowledge: remain aware of the situation and acquire the right data; (2) support critical, systems thinking: provide aids to perception and decision, supporting exploratory thinking about alternative hypotheses, future courses of action (options assessment), and consequences; (3) shared operating picture: distribute and apply the knowledge effectively; (4) focus knowledge creation: optimize the information supply chain; (5) protection of intellectual capital: ensure the protection of information. Representative enhancements include statistical sampling, intelligence data warehousing, forecasting aids, cost-risk analysis, collaborative electronic interaction tools, multiple-access business databases, data fusion and data mining, statistical process control, electronic mail and e-mail security, intelligence distribution (intelligence links), collaborative decision-making aids, a real-time common operating picture (COP), sensor system refinements in coverage, detection, precision, revisit rate, and dwell, multisensor coverage, industrial and military INFOSEC, database backup, key distribution, commercial encryption, Internet security (firewalls, encryption), intrusion detection, and operational security.]
[Figure: levels of abstraction in the knowledge hierarchy and the processes that move between them. Wisdom is knowledge effectively applied; application is the process of applying knowledge to effectively implement a plan or action to achieve a desired goal or end state. Knowledge is information understood and explained; understanding is the process of comprehending static and dynamic relationships between sets of information and of synthesizing models to explain those relationships. Information is data placed in context, indexed, and organized; organization is the process of aligning, transforming, filtering, sorting, indexing, and storing data elements in relational context for subsequent retrieval. Data are measurements and observations of the physical process; observation is the process of collecting, tagging, and dispatching quantitative measurements to appropriate processing. Explicit processes along this flow include goal setting and decision-making judgment; reasoning and inference (induction, deduction, abduction); alignment (correlation and association, extrapolation, deconflicting); preprocessing (calibration, filtering, indexing); and sensing (collection, measurement, message parsing, data acquisition). The corresponding implicit processes include leadership; sensemaking (valuation, meaning creation, uncertainty management); ideation (metaphor creation, experience matching); orienting (sorting); and observing (experiencing).]
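To make the progression concrete, the following is a minimal sketch in Python, assuming illustrative names (Observation, organize, understand, apply_knowledge) that are not drawn from the book, of how each level of abstraction could map to a processing step, from raw measurements to a goal-directed action.

# Minimal sketch of the data-to-knowledge abstraction levels described above.
# Names and the simple per-tag model are illustrative assumptions.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Observation:            # Data: a raw measurement, tagged at collection time
    sensor: str
    value: float
    tag: str


def organize(observations: List[Observation]) -> Dict[str, List[Observation]]:
    """Organization: align, filter, sort, and index data elements by tag (context)."""
    indexed: Dict[str, List[Observation]] = {}
    for obs in observations:
        indexed.setdefault(obs.tag, []).append(obs)
    return indexed


def understand(information: Dict[str, List[Observation]]) -> Dict[str, float]:
    """Understanding: synthesize a simple model (here, a per-tag average) that
    explains relationships within each information set."""
    return {tag: sum(o.value for o in obs) / len(obs) for tag, obs in information.items()}


def apply_knowledge(model: Dict[str, float], threshold: float) -> List[str]:
    """Application: use the model to select actions toward a goal (flag tags of interest)."""
    return [tag for tag, mean in model.items() if mean > threshold]


if __name__ == "__main__":
    data = [Observation("radar-1", 0.9, "site-A"), Observation("radar-2", 0.2, "site-B"),
            Observation("imint-1", 0.8, "site-A")]
    info = organize(data)                      # data -> information
    knowledge = understand(info)               # information -> knowledge
    print(apply_knowledge(knowledge, 0.5))     # knowledge -> action: ['site-A']

The sketch only illustrates that each step adds context (organization), explanation (a simple model), and finally goal-directed selection, mirroring the observation, organization, understanding, and application processes above.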
Knowledge As Object
The most common understanding of knowledge is as an object: the accumulation of things perceived, discovered, or learned. From this perspective, data (raw
measurements or observations), information (data organized, related, and placed
in context), and knowledge (information explained and the underlying processes
understood) are also objects. The KM field has adopted two basic distinctions in
the categories of knowledge as object [7]:
1. Explicit knowledge. This is the better known form of knowledge that
has been captured and codified in abstract human symbols (e.g.,
mathematics, logical propositions, and structured and natural language). It is tangible, external (to the human), and logical. This documented knowledge can be stored, repeated, and taught by books
because it is impersonal and universal. It is the basis for logical reasoning and, most important of all, it enables knowledge to be communicated electronically and reasoning processes to be automated. The
development of language, logic, and mathematics has enabled scientific data to be captured, human thought to be recorded, and each to
be logically analyzed external to mind. Newspapers and novels,
HTML content, scientific data, and engineering data all convey
explicit knowledge that can be stored, retrieved, and analyzed.
2. Tacit knowledge. This is the intangible, internal, experiential, and intuitive knowledge that is undocumented and maintained in the human
mind. It is a personal knowledge contained in human experience. Philosopher Michael Polanyi pioneered the description of such knowledge
in the 1950s, considering the results of Gestalt psychology and the
philosophic conflict between moral conscience and scientific skepticism. In The Tacit Dimension [8], he describes a kind of knowledge that
"we cannot tell." This tacit knowledge is characterized by intangible factors such as perception, belief, values, skill, gut feel, intuition,
know-how, or instinct; this knowledge is unconsciously internalized
and cannot be explicitly described (or captured) without effort. Polanyi
described perception as "the most impoverished form" of tacit knowing [9], and he asserted that there exist higher creative forms of tacit
knowing. This kind of knowledge forms the bridge between perception
and the higher forms of (conscious) reasoning that we can "tell about"
more easily. This is the personal knowledge that is learned by experience, honed as a skill, and often applied subconsciously.
These two forms can be contrasted (Table 3.4) as two means of knowledge
representation as well as two modes of human thought. Some have described
[Table 3.4 (fragment): knowledge constructs and modes of human thought, contrasting explicit and tacit forms across rows for historical basis (the tacit side citing Pascal's Pensées, metaphysics and the mind), knowledge exchange, knowledge description, knowledge presentation, and protection.]
Devlin contrasts the current and promising new analytic techniques for
explicit and tacit representations of both mind and knowledge (Table 3.5). An
understanding of the relationship between knowledge and mind is of particular
interest to the intelligence discipline, because these analytic techniques will serve
two purposes:
1. Mind as knowledge manager. Understanding of the processes of
exchanging tacit and explicit knowledge will, of course, aid the KM
process itself. This understanding will enhance the efficient exchange
of knowledge between mind and computer, between internal and
external representations.
2. Mind as intelligence target. Understanding of the complete human
processes of reasoning (explicit logical thought) and sensemaking
(tacit, emotional insight) will enable more representative modeling of
adversarial thought processes. This is required to understand the
human mind as an intelligence target, representing perceptions,
beliefs, motives, and intentions [13]. (In Section 5.5, intelligence
applications of mental models are described more fully.)
[Table 3.5 (fragment): approaches to understanding the human mind, contrasting the Cartesian emphasis (explicit representation, reductionism) with Pascal's emphasis (tacit representation, holism) in terms of mind, knowledge, and the elements of the analytic approach; the Cartesian elements begin with cognition, the study of mind as rational rule execution, a context-free algebra of thought independent of body (Descartes).]
Davidow and Malone have categorized knowledge, both tacit and explicit,
in The Virtual Corporation into four general classes based on the way in which the
knowledge is applied (Table 3.6) [14]. The categories move from explicit static
and dynamic descriptions (models and simulations, respectively [15]) to more
tacit representations that are actionable. These categories are helpful to distinguish the movement from explicit representations (data and information)
toward mixed tacit and explicit knowledge, which leads to action. Behavioral
knowledge, for example, can be represented in explicit simulations that immerse
the analyst in an effort to provide tacit experience with the dynamics of the
simulated environment. The basis of the simulation may be both content and
[Table 3.6: Categories of Knowledge in Business and National Intelligence. The table arrays the categories (explicit content and form information through more tacit, actionable knowledge) against business and intelligence examples and levels of understanding. Representative business examples include historical records describing the existence, location, and state of physical items (inventory) and abstract entities (accounts); inventories of materials and products; product descriptions; expert judgment of executive officers; economic models, market dynamic models, and engineering simulations; industrial robotics; and automated stock trading. Representative intelligence examples include force inventories; orders, personnel records, and intelligence reports; orders of battle; target models of discriminants for automatic target recognition (ATR); force model descriptions; skills and expertise of experienced analysts; weapon simulations and battle management simulation tools; expert judgment of senior intelligence officers; alternative-outcomes decision aids; and automated sensor management.]
form information about the environment. The result is both tacit and explicit
actionable knowledge: insight into alternatives and consequences, risks and payoffs, and areas of uncertainty. All of these form a sound basis for judgment to
take action.
Table 3.6 provides representative examples of each category for business
and intelligence applications.
Previously, we have used the terms resource and asset to describe knowledge, but it is not only an object or a commodity to be managed. Knowledge
can also be viewed as a dynamic, embedded in processes that lead to action. In
the next section, we explore this complementary perspective of knowledge.
3.2.2
Knowledge As Process
Knowledge can also be viewed as the action, or dynamic process of creation, that
proceeds from unstructured content to structured understanding. This perspective considers knowledge as action, as knowing. Because knowledge explains
the basis for information, it relates static information to a dynamic reality.
Knowing is uniquely tied to the creation of meaning.
The knowing processes, both explicit and tacit, move from components
(data) toward integrated understanding of meaning, relating the abstractions
of knowledge to the real world (Figure 3.3). The two paths, though separate columns in the table, are not independent but are interactive. (Polanyi believed
that all explicit knowledge, or its interpretation, is rooted in tacit knowledge.)
The explicit knowing process is referred to as reasoning; as described earlier, it is attributed to the Western emphasis on logic, reductionism, and dualism. This knowing process emphasizes the abstraction of truth in the intellect of
the individual.
[Figure 3.3: knowledge form and process. At the data level, content consists of independent abstractions: text, symbolic, and numeric data on the explicit side; experiences, feelings, and emotions on the tacit side. At the information level, context relates items to each other: relationships, links, and indexed data on the explicit side; images, metaphors, and ideas on the tacit side. At the knowledge level, explicit reasoning in the intellect yields hypotheses, explanations, beliefs, and education, while tacit sensemaking yields insight, imagination, understandings, and perceptions built on experience; the contribution of this level is meaning relative to reality and a basis for action in reality.]
By contrast, the tacit knowing process has been called sensemaking, a more holistic form of knowing more closely related to the Eastern emphasis on holistic intuition and oneness. In contrast to the dualism of mind and body, this view emphasizes humanity-nature oneness (and therefore mind-body and self-other oneness). This knowing process focuses on the action of truth in the
character of the individual.
Karl Weick introduced the term sensemaking to describe the tacit knowing
process of retrospective rationality, the method by which individuals and
organizations seek to rationally account for things by going back in time to
structure events and explanations holistically [16]. We do this to make sense
of reality, as we perceive it, and create a base of experience, shared meaning, and
understanding.
To model and manage the knowing process of an organization requires
attention to both of these aspects of knowledge: one perspective emphasizing
cognition, the other emphasizing culture and context. The general knowing
process includes four basic phases that can be described in process terms that
apply to tacit and explicit knowledge, in human and computer terms, respectively (Figure 3.4):
1. Acquisition. This process acquires knowledge by accumulating data
through human observation and experience or technical sensing and
measurement. The capture of e-mail discussion threads, point-of-sales
transactions, or other business data, as well as digital imaging or signals
analysis are but examples of the wide diversity of acquisition methods.
2. Maintenance. Acquired explicit data is represented in a standard form,
organized, and stored for subsequent analysis and application in digital
databases. Tacit knowledge is stored by humans as experience, skill, or
expertise, though it can be elicited and converted to explicit form in
terms of accounts, stories (rich explanations), procedures, or
explanations.
3. Transformation. The conversion of data to knowledge and knowledge
from one form to another is the creative stage of KM. This
knowledge-creation stage involves more complex processes like internalization, intuition, and conceptualization (for internal tacit knowledge) and correlation and analytic-synthetic reasoning (for explicit
knowledge). In the next subsection, this process is described in greater
detail.
4. Transfer. The distribution of acquired and created knowledge across
the enterprise is the fourth phase. Tacit distribution includes the sharing of experiences, collaboration, stories, demonstrations, and
hands-on training. Explicit knowledge is distributed by mathematical,
[Figure 3.4: the four knowledge process phases (acquire, maintain, transform, and transfer knowledge), each described in tacit human terms (e.g., observe, experience, remember and recall, socialize, internalize, envision and conceptualize, tell stories, share experience, demonstrate, mentor), in explicit computer terms (e.g., sense, measure, capture, and collect; store, catalog, index, update, search, and protect; validate, correlate and link, reason by deduction, abduction, and induction, synthesize, compile, and combine; disseminate, exchange, collaborate, control workflow, push by subscription or broadcast, and pull by requisition), and in the organizational terms of the Davenport and Prusak model (generation by acquiring or renting experts, dedicating resources and teams, fusing diverse perspectives, adapting to crisis, and networking people; codification and coordination by mapping structure, modeling dynamics, and narrating through storytelling; and transfer by exchanging people, mentoring, transferring and absorbing knowledge, and reducing friction to enhance trust).]
A widely adopted and insightful model of the processes of creating and exchanging knowledge, or knowledge conversion, within an organization was developed by Ikujiro Nonaka and Hirotaka Takeuchi in The Knowledge-Creating
Company [18]. The model is very helpful in understanding how analysts interact
with computer support (automation) to create intelligence within the intelligence organizations. Nonaka and Takeuchi describe four modes of conversion,
derived from the possible exchanges between two knowledge types (Figure 3.5):
1. Tacit to tacit: socialization. Through social interactions, individuals
within the organization exchange experiences and mental models,
transferring the know-how of skills and expertise. The primary form of
transfer is narrative (storytelling), in which rich context is conveyed
and subjective understanding is compared, reexperienced, and internalized. Classroom training, simulation, observation, mentoring, and
on-the-job training (practice) build experience; moreover, these activities also build teams that develop shared experience, vision, and values.
The socialization process also allows consumers and producers to share
tacit knowledge about needs and capabilities, respectively.
2. Tacit to explicit: externalization. The articulation and explicit codification of tacit knowledge moves it from internal to external. This
can be done by capturing narration in writing, and then moving to the
construction of metaphors, analogies, and ultimately models. Externalization is the creative mode where experience and concept are expressed in explicit concepts, and the effort to express is in itself a creative act. (This mode is found in the creative phase of writing, invention, scientific discovery, and, for the intelligence analyst, hypothesis creation.)
[Figure 3.5: the knowledge-conversion spiral. Transfers proceed from tacit to tacit (socialization), tacit to explicit (externalization), explicit to explicit (combination), and explicit to tacit (internalization), and the sequence continues in an unending spiral.]
3. Explicit to explicit: combination. Once explicitly represented, different
objects of knowledge can be characterized, indexed, correlated, and
combined. This process can be performed by humans or computers
and can take on many forms. Intelligence analysts compare multiple
accounts, cable reports, and intelligence reports regarding a common
subject to derive a combined analysis. Military surveillance systems
combine (or fuse) observations from multiple sensors and HUMINT
reports to derive aggregate force estimates. Market analysts search
(mine) sales databases for patterns of behavior that indicate emerging
purchasing trends. Business developers combine market analyses,
research and development results, and cost analyses to create strategic
plans. These examples illustrate the diversity of the combination
processes that combine explicit knowledge.
4. Explicit to tacit: internalization. Individuals and organizations internalize knowledge by hands-on experience in applying the results of
combination. Combined knowledge is tested, evaluated, and results
in new tacit experience. New skills and expertise are developed and
integrated into the tacit knowledge of individuals and teams.
Nonaka and Takeuchi further showed how these four modes of conversion
operate in an unending spiral sequence to create and transfer knowledge
throughout the organization (Figure 3.5). The internalization mode naturally
leads to further socialization, and the process leads to further tacit sharing, creativity, and knowledge expansion. The spiral model represents the concept of an
ever-learning organization, expanding in knowledge and the application of that
knowledge to the dynamic business environment.
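The combination mode (3, above) lends itself most directly to automation. The following is a minimal sketch, assuming hypothetical report fields (source, subject, claim, confidence) and a simple merge rule that are not drawn from the book, of an explicit-to-explicit combination step that correlates multi-source reports on a common subject; it illustrates the idea rather than any particular intelligence system.

# Minimal sketch of explicit-to-explicit combination: correlating multi-source
# reports about a common subject and merging them into a combined summary.
# Report fields and the merge rule are illustrative assumptions.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Report:
    source: str      # e.g., "HUMINT", "SIGINT", "open source"
    subject: str     # common subject used to correlate reports
    claim: str       # the explicit content of the report
    confidence: float


def combine(reports: List[Report]) -> Dict[str, dict]:
    """Correlate reports by subject, then combine claims and average confidence."""
    grouped: Dict[str, dict] = {}
    for r in reports:
        entry = grouped.setdefault(r.subject, {"sources": set(), "claims": [], "confidences": []})
        entry["sources"].add(r.source)
        entry["claims"].append(r.claim)
        entry["confidences"].append(r.confidence)
    # Reduce each subject's evidence to a single combined estimate
    return {
        subject: {
            "sources": sorted(e["sources"]),
            "claims": e["claims"],
            "confidence": sum(e["confidences"]) / len(e["confidences"]),
        }
        for subject, e in grouped.items()
    }


if __name__ == "__main__":
    reports = [
        Report("HUMINT", "PLF finances", "front companies in the capital", 0.6),
        Report("open source", "PLF finances", "donations routed offshore", 0.4),
    ]
    print(combine(reports)["PLF finances"]["confidence"])  # 0.5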
Based on this model, Nonaka and Takeuchi identified five enabling conditions that promote creation within the spiral (Table 3.7). These conditions promote the cohesion of organizational purpose, freedom of thought, and breadth
of perspective necessary to permit the organization to transfer knowledge
between tacit and explicit forms and to explore new perspectives without
boundaries. These conditions can best be seen in small teams (e.g., intelligence-analysis teams, crisis-analysis teams, and decision-making teams), although they
apply across large organizations. Intention (also described as shared vision and
commitment in numerous management texts) provides organizational cohesion
of purpose and reduces the friction from competitions for different objectives.
Autonomous teams are given the freedom to explore alternative solutions
beyond current mindsets; access to information (i.e., people, databases, and
processes) is not restricted. Organizations that have redundancy of information
(in people, processes, and databases) and diversity in their makeup (also in people, processes, and databases) will enhance the ability to move along the spiral.
The modes of activity benefit from a diversity of people: socialization requires
some who are stronger in dialogue to elicit tacit knowledge from the team; externalization requires others who are skilled in representing knowledge in explicit
forms; and internalization benefits from those who experiment, test ideas, and
learn from experience, with the new concepts or hypotheses arising from combination. These redundancies and diversities also apply to processes and information sources, which provide different perspectives in each stage of the spiral.
Organizations can also benefit from creative chaos: changes that punctuate states of organizational equilibrium. These states include static presumptions, entrenched mindsets, and established processes that may have lost validity
in a changing environment. Rather than destabilizing the organization, the
injection of appropriate chaos can bring new-perspective reflection, reassessment, and renewal of purpose. Such change can restart tacit-explicit knowledge
exchange, where the equilibrium has brought it to a halt.
Underlying this model is Nonaka and Takeuchi's important assertion that
the basis of this creative process is the tacit knowledge of individuals.
Table 3.7
Enabling Conditions for the Knowledge-Creation Spiral
Intention: the shared vision and commitment that provides organizational cohesion of purpose.
Autonomy: individual liberty of team members in thought, exploration, and action.
Redundancy: internal overlapping of information about activities, responsibility, and purpose.
Requisite variety: diversity in the organization's makeup (people, processes, and information sources) matching the variety of its environment.
Creative chaos: introduction of actions to stimulate beneficial interaction between the organization and its environment.
To illustrate how the spiral operates in an intelligence environment, we follow a future fictional, yet representative, crisis situation in which U.S. intelligence is confronted
with a crisis in a failing nation-state that threatens U.S. national interests. We
follow a distributed crisis intelligence cell, using networked collaboration tools,
through one complete spiral cycle. This case is deliberately
chosen because it stresses the spiral (no face-to-face interaction by the necessarily
distributed team, very short time to interact, the temporary nature of the team,
and no common organizational membership), yet it clearly illustrates the phases
of tacit-explicit exchange and the practical insight into actual intelligence-analysis activities provided by the model.
3.3.1
The Situation
The crisis in small but strategic Kryptania emerged rapidly. Vital national interests, including the security of U.S. citizens, U.S. companies and facilities, and the stability of
the fledgling democratic state, were at stake. Subtle but cascading effects in the
environment, economy, and political domains triggered the small political liberation front (PLF) to initiate overt acts of terrorism against U.S. citizens, facilities, and embassies in the region while seeking to overthrow the fledgling
democratic government. The PLF waged information operations, spreading
rumors via e-mail, roaming AM radio broadcasts, and publishing black propaganda on the Internet. The PLF also corrupted Kryptanian government information systems to support false claims of political corruption. A crisis
intelligence analysis cell is rapidly formed, comprised of the following globally
distributed participants:
Five intelligence officers in Washington, D.C., including a team leader
and four analysts with experience in the country and language skills;
Six political scientists with expertise in Kryptania in four universities
Kryptania;
Six Kryptanian government security officials in Kryptania.
The crisis team is formed and all participants are notified and issued public/
private keys (at their appropriate access levels) to crisis collaboration portals/
collaboration workspaces on computer networks. The first portal is a secure collaborative workspace (a specially secured virtual private network on the Internet)
for sensitive but unclassified (SBU) information access by the academics and
7. Analytic tools that can be accessed and applied to the group or individual data.
The team composition includes a diverse mix of intelligence officers,
trusted academics, and Kryptanian government officials (requisite variety and
redundancy within the limits of security), along with a common vision to
understand and mitigate the threat. The team is provided a loose charter to
identify specific threat patterns, organizations, and actions (autonomy); the current crisis provides all the creative chaos necessary for the newly formed team.
This first spiral of knowledge creation (Figure 3.6) occurs within the first
several days of the team's formation.
3.3.2
Socialization
[Figure 3.6: the crisis team's first spiral of knowledge creation, with elements including the crisis team defining the problem, applying experience, and learning; entering filters and search and retrieval keys for needed data; knowledge-base models; automated combination and analysis of data in databases; and viewing, conceiving, and understanding explanations and patterns.]
workspace created for the team. The team leader briefs the current situation and
the issues: areas of uncertainty, gaps in knowledge or collection, needs for information, and possible courses of events that must be better understood. The
group is allowed time to exchange views and form their own subgroups on areas
of contribution that each individual can bring to the problem. Individuals
express concepts for new sources for collection and methods of analysis. In this
phase, the dialogue of the team, even though not face to face, is invaluable in
rapidly establishing trust and a shared vision for the critical task over the ensuing
weeks of the crisis. Over the course of the next day, several total-group and subgroup teleconferences sustain the dialogue and begin to allow the members to
exchange tacit perspectives of Kryptania, approaches to understanding the
threats, and impressions of where the crisis might lead. This process of dialogue
exposes the diversity of mental models about the threat and even the different
interpretations of the group's charter (the organizational intention). As this happens, the team begins to request additional sources or types of information on the
portal and starts to record requests, impressions, needed actions, and needs for
charter clarifications (questions about the boundaries of the problem and
restrictions on access) on the bulletin board.
3.3.3
Externalization
The initial discussions lead to the creation of initial explicit models of the threat
that are developed by various team members and posted on the portal for all to
see, including:
1. Structure charts of the PLF and possible linked financial supporters
and organized crime operations;
2. Lists of likely sources of the black propaganda;
3. Map of Kryptania showing cities of greatest influence by the PLF and
supporters;
4. Time history of PLF propaganda themes and terrorist activities;
5. Causal chains of past actions and hypotheses of possible future courses
of PLF actions.
The team collaboratively reviews and refines these models by updating
new versions (annotated by contributors) and suggesting new submodels (or
linking these models into supermodels). This externalization process codifies the
team's knowledge (beliefs) and speculations (to be evaluated) about the threat.
Once externalized, the team can apply the analytic tools on the portal to search
for data, link evidence, and construct hypothesis structures. The process also
allows the team to draw on support from resources outside the team to conduct
3.3.4
Combination
The codified models become archetypes that represent current thinking: current prototype hypotheses formed by the group about the threat (who: their
makeup; why: their perceptions, beliefs, intents, and timescales; what: their
resources, constraints and limitations, capacity, feasible plans, alternative
courses of action, and vulnerabilities). This prototype-building process requires the
group to structure its arguments about the hypotheses and combine evidence to
support its claims. The explicit evidence models are combined into higher level
explicit explanations of threat composition, capacity, and behavioral patterns.
Initial (tentative) intelligence products are forming in this phase, and the team
begins to articulate these prototype products, resulting in alternative hypotheses and even recommended courses of action for the United States and
Kryptania.
3.3.5
Internalization
As the evidentiary and explanatory models are developed on the portal, the team
members discuss (and argue) over the details, internally struggling with acceptance or rejection of the validity of the various hypotheses. Individual team
members search for confirming or refuting evidence in their own areas of expertise and discuss the hypotheses with others on the team or colleagues in their
domain of expertise (often expressing them in the form of stories or metaphors)
to experience support or refutation. This process allows the members to further
refine and develop internal belief and confidence in the predictive aspects of the
models. As accumulating evidence over the ensuing days strengthens (or refutes)
the hypotheses, the process continues to internalize those explanations that the
team has developed that are most accurate; they also internalize confidence in
the sources and collaborative processes that were most productive for this
ramp-up phase of the crisis situation.
3.3.6
Socialization
As the group periodically reconvenes, the subject focuses away from "what we
must do" and toward the evidentiary and explanatory models that have been produced.
The dialogue turns from issues of startup processes to model-refinement
processes. The group now socializes around a new level of the problem: Gaps in
the models, new problems revealed by the models, and changes in the evolving
crisis move the spiral toward new challenges to create knowledge about
3.3.7
Summary
This example illustrates the emergent processes of knowledge creation over the
several-day ramp-up period of a distributed crisis intelligence team. The full spiral moved from team members socializing to exchange the tacit knowledge of
the situation toward the development of explicit representations of their tacit
knowledge. These explicit models allowed other supporting resources to be
applied (analysts external to the group and on-line analytic tools) to link further
evidence to the models and structure arguments for (or against) the models. As
the models developed, team members discussed, challenged, and internalized
their understanding of the abstractions, developing confidence and hands-on
experience as they tested them against emerging reports and discussed them with
team members and colleagues. The confidence and internalized understanding
then led to a drive for further dialogue, initializing a second cycle of the spiral.
3.4 Taxonomy of KM
Using the fundamental tacit-explicit distinctions, and the conversion processes
of socialization, externalization, internalization, and combination, we can establish a helpful taxonomy of the processes, disciplines, and technologies of the
broad KM field applied to the intelligence enterprise. A basic taxonomy that
categorizes the breadth of the KM field (Table 3.8) can be developed by distinguishing three areas of distinct (though very related) activities:
1. People. The foremost area of KM emphasis is on the development of
intellectual capital by people and the application of that knowledge by
those people. The principal knowledge-conversion process in this area
is socialization, and the focus of improvement is on human operations,
training, and human collaborative processes. The basis of collaboration is human networks, known as communities of practice, sharing
purpose, values, and knowledge toward a common mission. The barriers that challenge this area of KM are cultural in nature.
2. Processes. The second KM area focuses on human-computer interaction (HCI) and the processes of externalization and internalization.
Tacit-explicit knowledge conversions have required the development
of tacit-explicit representation aids in the form of information visualization and analysis tools, thinking aids, and decision support systems.
This area of KM focuses on the efficient networking of people and
machine processes (such autonomous support processes are referred to
as agents) to enable the shared reasoning between groups of people and
their agents through computer networks. The barrier to achieving
robustness in such KM processes is the difficulty of creating a shared
context of knowledge among humans and machines.
3. Processors. The third KM area is the technological development and
implementation of computing networks and processes to enable
explicit-explicit combination. Network infrastructures, components,
and protocols for representing explicit knowledge are the subject of
this fast-moving field. The focus of this technology area is networked
computation, and the challenges to collaboration lie in the ability to
sustain growth and interoperability of systems and protocols.
Table 3.8
Basic KM Taxonomy for the Intelligence Enterprise
(KM: acquiring, creating, maintaining, and applying knowledge to achieve organizational objectives)

People (operational view). Knowledge conversion: socialization (tacit-to-tacit transactions). Focus of the enterprise: operations, business processes, and training. Basis of collaboration: networks of people (communities of practice) with shared purpose, values, practice, and knowledge. Barriers to collaboration and interoperation: culture.

Processes (human-computer interaction view). Knowledge conversion: externalization and internalization (transactions between tacit and explicit). Focus of the enterprise: networking of people and machine processes (agents). Basis of collaboration: shared reasoning between people and their agents through computer networks. Barriers to collaboration and interoperation: shared context.

Processors (technical view). Knowledge conversion: combination (explicit-to-explicit transactions). Focus of the enterprise: infrastructure, knowledge representation, and protocols. Basis of collaboration: networked computation with shared configuration of content in networks and nodes (computers). Barriers to collaboration and interoperation: growth and interoperability of systems and protocols.
Note that these three areas correspond to three basic descriptive views of
the enterprise that will be subsequently introduced in Chapter 9.
The taxonomy can be further extended (Table 3.9) to consider the disciplines and supporting tools and technologies in each of these three areas:
1. People. The objective of people-oriented disciplines is to create
a knowledge-based organization that learns, shares, and creates
knowledge collaboratively. The tools and technologies applied to this
discipline range from collaborative services to create virtual (distributed) teams and supporting services, to eLearning tools that integrate learning into the work process.
[Table 3.9: Taxonomy of Disciplines and Supporting Tools and Technologies. People (operational view): objective of a collaborative, learning organization; disciplines of collaboration for problem solving, eLearning, and virtual teaming; supporting tools for virtual team establishment and support across time and space, automatic experience capturing and linking (cases) to problems, and automated training and eLearning. Processes (human-computer interaction view): objective of efficient HCI; disciplines of HCI, data capturing, representing, and warehousing, knowledge sharing, and human-agent collaboration; supporting tools for presenting data, information, and high-dimensionality knowledge to humans (virtual and artificial reality), high-level abstract interaction between human and machine agents, and human-machine problem solving and workflow. Processors (technical view): objective of effective human-computer networks; disciplines of automation and networked computing; supporting tools for data representation and knowledge mapping to index, correlate, and link (externalize and internalize) knowledge, and for search and retrieval.]
2. Processes. HCI and related disciplines have the objective of achieving
efficient human-machine interaction, enabling human-agent teams to
smoothly exchange tacit and explicit knowledge. Tools that support
this process include virtual- and artificial-reality visualizations (and
multisensory presentations), human-machine conversation, and
autonomous agent services to search and explore large data volumes.
3. Processors. Effective computer networks are the objective of the diverse
computing disciplines that support KM: enterprise architecting, networked computing infrastructure, data warehousing, services for
information management, collaboration, cognitive (reasoning) support, and knowledge distribution.
Because the KM field can also be described by the many domains of expertise (or disciplines of study and practice), we can also distinguish five distinct
areas of focus (Table 3.10) that help describe the field. The first two disciplines
view KM as a competence of people and emphasize making people
knowledgeable:
1. Knowledge strategists. Enterprise leaders, such as the chief knowledge
officer (CKO), focus on the enterprise mission and values, defining
value propositions that assign contributions of knowledge to value
(i.e., financial or operational). These leaders develop business models
to grow and sustain intellectual capital and to translate that capital into
organizational values (e.g., financial growth or organizational performance). KM strategists develop, measure, and reengineer business
processes to adapt to the external (business or world) environment.
2. Knowledge culture developers. Knowledge culture development and
sustainment is promoted by those who map organizational knowledge
and then create training, learning, and sharing programs to enhance
the socialization performance of the organization. This includes the
cadre of people who make up the core competencies of the organization (e.g., intelligence analysis, intelligence operations, and collection
management). In some organizations, a chief learning officer (CLO) is
designated for this role to oversee enterprise human capital, just as the
chief financial officer (CFO) manages (tangible) financial capital.
The next three disciplines view KM as an enterprise capability and emphasize building the infrastructure to make knowledge manageable:
3. KM applications. Those who apply KM principles and processes to specific business applications create both processes and products (e.g.,
software application packages) to provide component or end-to-end services in a wide variety of areas listed in Table 3.10. Some commercial
KM applications have been sufficiently modularized to allow them
to be outsourced to application service providers (ASPs) [20] that
Table 3.10
The Disciplines of KM
Making people knowledgeable (KM as a competence):
1. Knowledge strategy;
2. Knowledge (learning) culture developers.
Making knowledge manageable (KM as a capability):
3. KM applications;
4. Enterprise architecture;
5. Technology and tools.
[Figure: a notional balance sheet of intangible intellectual capital. A firm's market value (here $100M) comprises tangible assets (cash, accounts receivable, facilities inventory) and short-term and long-term debt, alongside intangible intellectual capital, which is decomposed into organizational capital (innovation capital and process capital) and human capital (the competence, skill, and experience of the workforce, corporate tacit knowledge, culture, and spirit).]
that achieve these values ensure that new ideas move to development
and then to product by accelerating the product development process.
This value emphasizes the transformation of the business itself (as
explained in Section 1.1).
3. Customer intimacy. These values seek to increase customer loyalty,
customer retention, and customer base expansion by increasing intimacy (understanding, access, trust, and service anticipation) with customers. Actions that accumulate and analyze customer data to reduce
selling cost while increasing customer satisfaction contribute to this
proposition.
For each value proposition, specific impact measures must be defined to
quantify the degree to which the value is achieved. These measures quantify the
benefits and utility delivered to stakeholders. Using these measures, the value
added by KM processes can be observed along the sequential processes in the
business operation. This sequence of processes forms a value chain that adds
value from raw materials to delivered product. Table 3.11 compares the impact
measures for a typical business operation to comparable measures in the intelligence enterprise.
It should be noted that these measures are applicable to the steady-state
operation of the learning and improving organization. Different kinds of measures are recommended for organizations in transition from legacy business models. During periods of change, three phases are recognized [24]. In the first
phase, users (i.e., consumers, collection managers, and analysts) must be convinced of the benefits of the new approach, and the measures include metrics as
simple as the number of consumers taking training and beginning to use services. In the crossover phase, when users begin to transition to the systems, measures change to usage metrics. Once the system approaches steady-state use,
financial-benefit measures are applied. Numerous methods have been defined
and applied to describe and quantify economic value, including:
1. Economic value added (EVA) subtracts the cost of capital invested from
net operating profit (a numerical sketch follows this list);
2. Portfolio management approaches treat IT projects as individual
investments, computing risks, yields, and benefits for each component
of the enterprise portfolio;
3. Knowledge capital is an aggregate measure of management value
added (by knowledge) divided by the price of capital [25];
4. Intangible asset monitor (IAM) [26] computes value in four categories: tangible capital, intangible human competencies, intangible
internal structure, and intangible external structure [27].
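As a numerical illustration of the first and third measures, the sketch below uses hypothetical figures (none are from the book): EVA as net operating profit less a charge for the capital invested, and knowledge capital as management value added divided by the price of capital.

# Illustrative sketch of two of the valuation measures above; all figures are
# hypothetical and the formulas are simplified for illustration only.

def economic_value_added(net_operating_profit_after_tax: float,
                         invested_capital: float,
                         cost_of_capital_rate: float) -> float:
    """EVA: operating profit less the cost of the capital invested to produce it."""
    capital_charge = invested_capital * cost_of_capital_rate
    return net_operating_profit_after_tax - capital_charge


def knowledge_capital(management_value_added: float,
                      price_of_capital_rate: float) -> float:
    """Knowledge capital: aggregate management value added divided by the price of capital."""
    return management_value_added / price_of_capital_rate


if __name__ == "__main__":
    # A firm earning $12M operating profit on $100M invested capital at a 10% cost of capital
    print(economic_value_added(12e6, 100e6, 0.10))   # 2,000,000.0 of value added
    # $3M of management value added capitalized at a 10% price of capital
    print(knowledge_capital(3e6, 0.10))              # 30,000,000.0 of knowledge capital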
[Table 3.11 (fragment): Business and Intelligence Impact Measures, arraying value propositions (e.g., customer intimacy, product leadership) against business KM impact measures (e.g., customer retention, time-to-market cycle) and intelligence KM impact measures (e.g., security record, safety record).]
[Table (fragment): defining goals and measures for the intelligence BSC views. The financial view defines the financial contribution of intelligence to stakeholders (cost, risk reduction, and so on); the internal view defines major measures of performance (MOPs) and effectiveness (MOEs); the learning and innovation view identifies how intelligence value is created by change and innovation in a changing threat environment; and each view identifies specific goals, measures, and MOP/MOE metric values.]
The financial view sets goals and measures quantitative financial metrics.
For a commercial company, this includes return metrics (e.g., return on equity,
return on capital employed, or return on investment) that measure financial
returns relative to a capital base. Traditional measures consider only the financial capital base, but newer measures consider return as a function of both tangible (financial, physical) and intangible (human capital) resources to provide a
measure of overall performance using all assets.
The four views of the BSC not only provide a means of balancing the measurement of the major causes and effects of organizational performance but also provide a framework for modeling the organization. Consider a representative BSC
cause-and-effect model (Figure 3.9) for an intelligence organization that uses the
four views to frame the major performance drivers and strategic outcomes
expected by a major KM initiative [30]. The strategic goals in each area can be
compared to corresponding performance drivers, strategic measures, and the
causal relationships that lead to financial effects. The model explicitly exposes
the presumed causal basis of the strategy and two categories of measures:
1. Performance drivers are lead indicators that precede the desired strategic effects. In Figure 3.9, training and introduction of a collaborative
network (both directly measurable) are expected to result in the strategic outcome of increased staff analytic productivity.
2. Strategic outcomes are the lag indicators that should follow if the
performance drivers are achieved, and the strategy is correct. Notice
that this model presumes many hidden causal factors (e.g., analytic
staff morale and workplace efficiency), which should be explicitly
defined when establishing the scorecard.
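A minimal sketch, with hypothetical measure names and goal values, of how such lead-lag relationships might be represented so that a shortfall in a performance driver flags the strategic outcomes presumed to depend on it.

# Minimal sketch of a cause-and-effect scorecard: lead indicators (performance
# drivers) linked to the lag indicators (strategic outcomes) they are presumed
# to drive. Measure names, goals, and values are illustrative assumptions.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Measure:
    name: str
    goal: float
    actual: float

    def met(self) -> bool:
        return self.actual >= self.goal


@dataclass
class CausalLink:
    driver: Measure            # lead indicator (e.g., collaboration-net availability and use)
    outcomes: List[Measure]    # lag indicators expected to follow (e.g., analytic productivity)


def flag_risks(links: List[CausalLink]) -> Dict[str, List[str]]:
    """Flag outcomes at risk because their presumed performance drivers fall short of goal."""
    at_risk: Dict[str, List[str]] = {}
    for link in links:
        if not link.driver.met():
            for outcome in link.outcomes:
                at_risk.setdefault(outcome.name, []).append(link.driver.name)
    return at_risk


if __name__ == "__main__":
    training = Measure("collaborative analysis training (share of staff)", goal=0.8, actual=0.55)
    productivity = Measure("analytic staff productivity index", goal=1.1, actual=1.0)
    print(flag_risks([CausalLink(training, [productivity])]))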
The model provides a compact statement of strategy (on the left column),
measures (in the boxes), and goals (specific quantitative values can be annotated
in the boxes). In this representative example, the organization is implementing
collaborative analytic training, a collaborative network, and an integrated lessons-learned database, expecting overall analytic productivity improvements. Internal
operations will be improved by cross-functional collaboration across intelligence
sources (INTs) and single-source analysts, the availability of multi-INT data on
the collaborative net, and by the increased source breadth and analytic depth
provided by increased sharing among analysts. These factors will result in
improved accuracy in long-term analyses for policymakers and more timely
responses to crisis needs by warfighters. The results of these learning and internal
improvements may be measured in customer satisfaction performance drivers,
and then in their perception of the value added by intelligence (the accuracy and
timeliness effects on their decisions).
[Figure 3.9: a representative BSC cause-and-effect model for an intelligence organization, relating strategic objectives, performance drivers, and strategic outcome measures. Learning perspective: develop and apply analytic lessons learned (L1), improve cross-functional collaboration on intelligence networks (L2), and provide collaborative analysis training (L3), with drivers including the lessons-learned knowledge base, collaboration-net availability and use, and collaborative analysis training, leading to analytic staff productivity. Internal perspective: increase analytic breadth and depth by applying nets and tools (I1), make multi-INT data libraries accessible with recent data (I2), and improve the efficiency of coordinated multi-INT collection and processing (I3), measured by analytic breadth and depth, multi-INT data availability, and collection/processing efficiency. Customer perspective: increase value added to the warfighter (C1) and to long-term policymaker consumers (C2), measured by I&W rate, target/ID and BDA accuracy, term estimate accuracy, crisis response time, issue/request anticipation, satisfaction measured by use and decisions, threat awareness, and warfighter and policymaker intelligence value added. Financial perspective: return on investment.]
Ultimately, these effects lead to overall improved intelligence organizational financial performance, measured by the capital value of threat awareness compared to the capital invested in the organization. (Capital invested includes existing tangible and intangible assets, not just the annual budget.) Notice that threat awareness is directly related to the all-critical value of reducing I&W risks (intelligence failures to warn that result in catastrophic losses; these are measured by policymakers).
anticipating their intelligence needs, earning greater user trust in the accuracy and focus of estimates and warnings, and providing more timely delivery of intelligence. Service value is also increased as producers personalize (tailor) and adapt services to the consumers' interests (needs) as they change.
Intelligence product values. The value of intelligence products is increased when greater value is added by improving accuracy, providing deeper and more robust rationale, focusing conclusions, and building increased consumer confidence (over time).
The second category of strategies (prompted by the eBusiness revolution)
seeks to transform the value chain by the introduction of electronic transactions
between the customer and retailer. These strategies use network-based advertising, ordering, and even delivery (for information services like banking, investment, and news) to reduce the friction of physical-world retailer-customer
entities between the customer and producer to reduce transaction friction. This friction adds cost and increases the difficulty for buyers to
locate sellers (cost of advertising), for buyers to evaluate products (cost
of travel and shopping), for buyers to purchase products (cost of sales)
and for sellers to maintain local inventories (cost of delivery). The
elimination of middlemen (e.g., wholesalers, distributors, and local
retailers) in eRetailers such as Amazon.com has reduced transaction and
intermediate costs and allowed direct transaction and delivery from
producer to customer with only the eRetailer in between. The effect of
disintermediation in intelligence is to give users greater and more
immediate access to intelligence products (via networks such as the U.S.
Intelink) and to analysis services via intelligence portals that span all
sources of intelligence.
Infomediation. The effect of disintermediation has introduced the role of the infomediary: rapid analysis of customer actions (e.g., queries for information, browsing through catalogs of products, and purchasing decisions based on
information). This analysis enables the producers to better understand
customers, aggregate their behavior patterns, and react to (and perhaps
anticipate) customer needs. Commercial businesses use these capabilities to measure individual customer patterns and mass market trends to
more effectively personalize and target sales and new product developments. Intelligence producers likewise are enabled to analyze warfighter
and policymaker needs and uses of intelligence to adapt and tailor products and services to changing security threats.
These value chain transformation strategies have produced a simple taxonomy to distinguish eBusiness models into four categories by the level of transaction between businesses and customers (Figure 3.10) [32]. Each model has
direct implications for intelligence product and service delivery:
1. Business to business (B2B). The large volume of trade between businesses (e.g., suppliers and manufacturers) has been enhanced by
network-based transactions (releases of specifications, requests for quotations, and bid responses) reducing the friction between suppliers and
producers. High-volume manufacturing industries such as the automakers are implementing B2B models to increase competition among
suppliers and reduce bid-quote-purchase transaction friction. This is
equivalent to process models that enable efficient electronic transactions between intelligence source providers (e.g., the IMINT,
SIGINT, or other stovepipes), which allow cross-source cueing, coordinated multiple-source collection, data fusion, and mining.
2. Business to customer (B2C). Direct networked outreach from producer
to consumer has enabled the personal computer (e.g., Dell Computer)
and book distribution (e.g., Amazon.com) industries to disintermediate local retailers and reach out on a global scale directly to customers.
Similarly, intelligence products are now being delivered (pushed) to
consumers on secure electronic networks, via subscription and express
order services, analogous to the B2C model.
[Figure 3.10: eBusiness models categorized by transaction endpoints (from business or customers, to business or customers): B2B (e.g., GM, Ford, D-C), B2C (e.g., Amazon.com, Dell Computer), C2B (e.g., Priceline.com), and C2C (e.g., eBay), with intelligence equivalents to each business model, including infomediary services to enable users to network and collaborate.]
returns are measured in terms of intelligence performance (i.e., knowledge provided, accuracy and timeliness of delivery, and completeness
and sufficiency for decision making) and outcomes (i.e., effects of warnings provided, results of decisions based on knowledge delivered, and
utility to set long-term policies).
Service customers, the intelligence consumers. This is done by provid-
[Figure 3.11: Enterprise architecture model related to the intelligence business model, relating enterprise strategy, competitive intelligence, and business operations to front-office applications facing customers (customer service, order capture/tasking) and back-office applications (plan-schedule/collection, procurement, manufacture, logistics-shipping/processing-analysis, dissemination, human resources management, operations analysis, business intelligence, and financial and intellectual capital management).]
goals are achieved as an effect of these other activities; for this reason, attention
to the architecture is critical to establish the organization's functional base,
which minimizes transaction friction in front- and back-office functions.
3.7.1 Customer Relationship Management
CRM processes that build and maintain customer loyalty focus on managing the
relationship between provider and consumer. The short-term goal is customer
satisfaction; the long-term goal is loyalty. Intelligence CRM seeks to provide
intelligence content to consumers that anticipates their needs, focuses on the
specific information that supports their decision making, and provides drill
down to supporting rationale and data behind all conclusions. In order to
accomplish this, the consumer-producer relationship must be fully described in
models that include:
Consumer needs and uses of intelligence: applications of intelligence
The SCM function monitors and controls the flow of the supply chain, providing internal control of planning, scheduling, inventory control, processing, and
delivery. Building on earlier generation enterprise resource planning (ERP)
functions, SCM functions can also extend beyond the enterprise to coordinate
the supply chain with suppliers (at the front end) and external customers (at the
Table 3.12
Business Customer and Intelligence Consumer Relationship Analogies

The business customer is a purchaser of goods and services; the intelligence consumer is an authorized person who uses intelligence or intelligence information directly in the decision-making process or to produce other intelligence.

Business CRM functions and their intelligence CRM analogs:
1. Track customer catalog browsing to understand interests and trends; track consumer intelligence portal browsing to understand interests and trends.
2. Track and record customer transaction history (inquiries, shopping and browsing of offerings, purchases, returns, satisfaction survey responses); track and record consumer transactions (inquiries for reports, searches for online data, requests for intelligence tasking such as topics or urgencies, and uses of intelligence such as decisions made based on intelligence, benefits of intelligence, or feedback).
3. Analyze individual customer buying patterns.
delivery end). SCM is the core of B2B business models, seeking to integrate
front-end suppliers into an extended supply chain that optimizes the entire production process to slash inventory levels, improve on-time delivery, and reduce
the order-to-delivery (and payment) cycle time. In addition to throughput efficiency, the B2B models seek to aggregate orders to leverage the supply chain to
gain greater purchasing power, translating larger orders to reduced prices. The
key impact measures sought by SCM implementations include:
payment);
Delivery performance (percentage of orders fulfilled on or before
request date);
Initial fill rate (percentage of orders shipped in the supplier's first shipment);
Initial order lead time (supplier response time to fulfill order);
On-time receipt performance (percentage of supplier orders received on
time).
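A minimal sketch (with hypothetical order records, not from the text) of how such supply chain measures might be computed from transaction data:

```python
from datetime import date

# Hypothetical order records (illustration only).
orders = [
    {"requested": date(2003, 3, 1), "delivered": date(2003, 2, 28),
     "qty_ordered": 100, "qty_first_shipment": 100},
    {"requested": date(2003, 3, 5), "delivered": date(2003, 3, 9),
     "qty_ordered": 50, "qty_first_shipment": 40},
]

on_time = sum(o["delivered"] <= o["requested"] for o in orders)
filled_first = sum(o["qty_first_shipment"] >= o["qty_ordered"] for o in orders)

delivery_performance = on_time / len(orders)   # orders fulfilled on or before request date
initial_fill_rate = filled_first / len(orders) # orders shipped complete in the first shipment

print(f"Delivery performance: {delivery_performance:.0%}")  # 50%
print(f"Initial fill rate: {initial_fill_rate:.0%}")        # 50%
```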
Table 3.13
Business and Intelligence SCM Analogies

The business SCM supply chain: order, plan, procure, produce, and ship (delivery, order fulfillment). The intelligence SCM supply chain: plan, task collection, acquire data, analyze all-source data, produce intelligence products, and disseminate. Both chains serve external customers.

3.7.3 Business Intelligence
The emphasis of BI is on explicit data capture, storage, and analysis; through the
1990s, BI was the predominant driver for the implementation of corporate data
warehouses, and the development of online analytic processing (OLAP) tools.
(BI preceded KM concepts, and the subsequent introduction of broader KM
concepts added the complementary need for capture and analysis of tacit and
explicit knowledge throughout the enterprise [34].)
BI implementations within an intelligence organization provide intelligence about intelligence: insight into the operational flow through the intelligence cycle. The intelligence BI function should collect and analyze real-time workflow data to provide answers to questions such as:
What are the relative volumes of requests (for intelligence) by type?
What is the cost of each category of intelligence product?
What are the relative transaction costs of each stage in the supply chain?
What are the trends in usage (by consumers) of all forms of intelligence
over the past 12 months? Over the past 6 months? Over the past week?
Which single sources of incoming intelligence (e.g., SIGINT, IMINT,
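A minimal sketch, using hypothetical workflow-log records, of the kind of aggregation an intelligence BI function might run to begin answering such questions:

```python
from collections import Counter

# Hypothetical workflow-log records (illustration only): one record per
# intelligence request, tagged with product type and production cost.
requests = [
    {"type": "current intelligence", "cost": 1200},
    {"type": "current intelligence", "cost": 900},
    {"type": "long-term estimate", "cost": 15000},
    {"type": "indications and warning", "cost": 2500},
]

# Relative volume of requests by type, and total cost per product category.
volume_by_type = Counter(r["type"] for r in requests)
cost_by_type = {}
for r in requests:
    cost_by_type[r["type"]] = cost_by_type.get(r["type"], 0) + r["cost"]

print("Relative volume of requests by type:", dict(volume_by_type))
print("Total cost by product category:", cost_by_type)
```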
3.8 Summary
KM provides a strategy and organizational discipline for integrating people,
processes, and IT into an effective enterprise. The development of KM as a discipline has moved through phases of emphasis, as noted by Tom Davenport, a
leading observer of the discipline:
The first generation of knowledge management within enterprises emphasized the supply side of knowledge: acquisition, storage, and dissemination of business operations and customer data. In this phase knowledge was
treated much like physical resources and implementation approaches
focused on building warehouses and channels for supply processing and
distribution. This phase paid great attention to systems, technology and
infrastructure; the focus was on acquiring, accumulating and distributing
explicit knowledge in the enterprise [35].
[Table: The three perspectives of KM (people/operational view, processes/HCI view, technology/technical view), their enterprise focus (operations, processes, and training; infrastructure and knowledge protocols), their knowledge transactions (internalization and externalization transactions between tacit and explicit knowledge), and the subsequent chapters of this book that address each: Chapter 4, the knowledge-based intelligence organization; Chapter 5, intelligence analysis and synthesis; Chapter 6, implementing analysis-synthesis; Chapter 7, knowledge transfer and transaction; Chapter 8, knowledge combination; Chapter 9, enterprise architecture.]
gies (computing processes, processing nodes, and interconnecting network technologies) that constitute the implementation of the
architecture of the intelligence enterprise.
Endnotes
[1] O'Dell, C., and Grayson, C. J., Jr., If Only We Knew What We Know, New York: Free
Press, 1998.
[2] KM definitions are quoted from the DoD Web site: http://center.dau.mil/Topical_
Sessions_templates/Knowledge_Management/Definitions_of_Knowledge_Management.htm.
[3] Watson, S., Getting to AHA! ComputerWorld, January 26, 1998.
[4] NSA adopted the definition from the APQC. See Brooks, C. C., Knowledge Management and the Intelligence Community, Defense Intelligence Journal, Vol. 9, No. 1, Winter 2000, p. 17. This issue of the journal is devoted to KM in the U.S. IC.
[5] Table 3.2 is based on Advancing Knowledge Management in DoD: A Primer for Executives
and Practitioners, Directorate of eBusiness & Knowledge Management, OASD/C3I,
September 2000, p. 2.
[6] Defense Planning Guidance FY02-07, April 2000, p. 102.
[7] Some texts refer to embedded knowledge as a third category; this knowledge integrated in
business processes can include either unconscious human process skills (tacit) or explicitly
coded computer programs (explicit).
[8] Polanyi, M., The Tacit Dimension, Garden City, NY: Doubleday, 1966.
[9] Polanyi, M., The Tacit Dimension, Garden City, NY: Doubleday, 1966, p. 7.
[10] Descartes published his Discourse on Method in 1637 and described his four-step
problem-solving method of analysis and synthesis in Part IIPrincipal Rules of the
Method.
[11] Pascal, B., Pensees, Part IV, The Means of Belief, 1660, para. 277.
[12] Devlin, K., Goodbye, Descartes: The End of Logic and the Search for a New Cosmology of the
Mind, New York: John Wiley & Sons, 1997, pp. vii and ix.
[13] This is not to imply that the human mind is a new intelligence target; leadership intentions and customer purchasing intentions are the targets of national and business intelligence, respectively. New representations will permit more accurate modeling of these
targets.
[14] Categories are adapted from: Davidow, W. H., and M. S. Malone, The Virtual Corporation, Chapter 3, New York: Harper-Collins, 1992.
[15] The U.S. Defense Modeling and Simulation Office distinguishes a model as "A physical, mathematical or otherwise logical representation of a system, entity, phenomenon or process," and a simulation as "A method for implementing a model over time." Models are essentially static representations, while simulations add dynamic (temporal) behavior.
[16] Weick, K., Sensemaking in Organizations, Thousand Oaks, CA: Sage Publications, 1995.
[17] Davenport, T. H., and Prusak, L., Working Knowledge: How Organizations Manage What
They Know, Boston: Harvard Business School Press, 1998.
[18] Nonaka, I., and H. Takeuchi, The Knowledge-Creating Company: How Japanese Companies
Create the Dynamics of Innovation, New York: Oxford University Press, 1995.
[19] Nonaka, I., and H. Takeuchi, The Knowledge-Creating Company: How Japanese Companies
Create the Dynamics of Innovation, New York: Oxford University Press, 1995, p. 72.
[20] The ASP business model is also called managed service provider, netsourcing, or total service provider models.
[21] Sveiby, K. E., The Invisible Balance Sheet, September 8, 1997, updated October 2001,
accessed on-line January 1, 2003 at http://www.sveiby.com/InvisibleBalance.htm.
[22] Effective Use of Intelligence, Notes on Analytic Tradecraft, Note 11, CIA Directorate of
Intelligence, February 1996, p. 2.
[23] O'Dell, C., and Grayson, C. J., Jr., If Only We Knew What We Know, New York: Free
Press, 1998, p. 133.
[24] Pastore, R., Noodling Numbers, CIO, May 1, 2000, p. 122.
[25] Strassman, P. A., The Value of Knowledge Capital, American Programmer, March 1998.
See also Strassman, P. A., Knowledge Capital, New Canaan, CT: The Information Economics Press, 1999.
[26] Sveiby, K. E., The New Organizational Wealth: Managing and Measuring Knowledge-Based
Assets, San Francisco: Berrett-Koehler, 1997. ICM is similar to the introductory concept
illustrated in Figure 3.7.
[27] See Skyrme, D., Measuring the Value of Knowledge, London: Business Intelligence, 1998.
[28] Kaplan, R. S. and D. P. Norton, The Balanced Scorecard, Boston: Harvard Business School
Press, 1996; see also Kaplan, R. S., and D. P. Norton, Using the Balanced Scorecard as a
Strategic Management System, Harvard Business Review, JanuaryFebruary 1996.
[29] A Consumer's Guide to Intelligence, Washington, D.C.: CIA, n.d., p. 42.
[30] This example model follows the approach introduced by Kaplan and Norton in The Balanced Scorecard, Chapter 7.
[31] Hagel, J., and M. Singer, Net Worth, Boston: Harvard Business Review Press, 1999.
[32] This figure is adapted from E-Commerce Survey, The Economist, February 26, 2000, p. 9.
[33] The concept of just-in-time delivery results from efficient supply chain management and results in reduced inventories (and the cost of inventory holdings). The inventory reduction benefits of just-in-time delivery of physical products can be high; for intelligence, the benefits of inventory (information) reductions are not as great, but the benefits of making
[34] Nylund, A. L., Tracing the BI Family Tree, Knowledge Management, July 1999,
pp. 7071.
[35] See Davenport, T., Knowledge Management, Round 2, CIO Magazine, November 1,
1999, p. 30; for a supporting viewpoint, see also Karlenzig, W., Senge on Knowledge,
Knowledge Management, July 1999, p. 22.
[36] Firestone, J. M., Accelerated Innovation and KM Impact, White Paper 14, Executive Information Systems, Inc., December 1999.
4
The Knowledge-Based Intelligence
Organization
National intelligence organizations following World War II were characterized
by compartmentalization (insulated specialization for security purposes) that
required individual learning, critical analytic thinking, and problem solving by
small, specialized teams working in parallel (stovepipes or silos). These stovepipes
were organized under hierarchical organizations that exercised central control.
The approach was appropriate for the centralized organizations and bipolar
security problems of the relatively static Cold War, but the global breadth and
rapid dynamics of twenty-first century intelligence problems require more agile
networked organizations that apply organizationwide collaboration to replace
the compartmentalization of the past. Founded on the virtues of integrity and
trust, the disciplines of organizational collaboration, learning, and problem solving must be developed to support distributed intelligence collection, analysis,
and production.
This chapter focuses on the most critical factor in organizational knowledge creation: the people, their values, and organizational disciplines. The chapter is structured to proceed from foundational virtues, structures, and communities of practice (Section 4.1) to the four organizational disciplines that support the knowledge creation process: learning, collaboration, problem solving, and best practices, called intelligence tradecraft (Sections 4.2 through 4.5, respectively). The emphasis in this chapter is on describing organizational qualities and their application in intelligence organizations.
Notice that the people perspective of KM presented in this chapter can be contrasted with the process and technology perspectives (Table 4.1) in five ways:
[Table 4.1: The three perspectives of KM (people/operational view, processes/HCI view, technology/technical view) compared by enterprise focus (e.g., operations, processes, and training; infrastructure and collaboration protocols), primary modes of knowledge transaction (e.g., internalization and externalization transactions between tacit and explicit knowledge), basis of collaboration (e.g., shared business processes and shared tacit and explicit knowledge), collaboration enablers (e.g., collaboration rules of engagement, asynchronous and synchronous groupware, communities of practice, organizational learning and problem solving, and tradecraft best practices), and barriers to collaboration (culture, context, and content).]
1. Enterprise focus. The focus is on the values, virtues, and mission shared
by the people in the organization.
2. Knowledge transaction. Socialization, the sharing of tacit knowledge by
methods such as story and dialogue, is the essential mode of transaction between people for collective learning, or collaboration to solve
problems.
3. Collaboration. The basis for human collaboration lies in shared purpose, values, and a common trust.
4. Enablers. A culture of trust develops communities that share their best
practices and experiences; collaborative problem solving enables the
growth of the trusting culture.
5. Barriers. The greatest barrier to collaboration is the inability of an
organization's culture to transform and embrace the sharing of values,
virtues, and disciplines.
implementation of training and business processes to develop an interagency collaborative culture and the deployment of supporting
technologies.
Invest in people and knowledge. This area includes the assessment of customer needs and the conduct of events (training, exercises, experiments,
and conferences/seminars) to develop communities of practice and
build expertise in the staff to meet those needs. Supporting infrastructure developments include the integration of collaborative networks
and shared knowledge bases.
Speaking like a corporate officer, the chairman of the U.S. National Intelligence Council emphasized the importance of collaboration to achieve speed
and accuracy while reducing intelligence production costs:
two types of collaborative tools are needed: collaboration in the production process (to increase speed and accuracy) and expertise-based collaboration (to enable teams of analysts to work on a project for several weeks or months). These new collaborative tools will allow analysts to discuss contentious analytical issues, share information like maps, imagery, and database information, and coordinate draft assessments, all on-line, from their
own workspaces, resulting in substantial savings of time and effort [3].
[Table: Moral and intellectual virtues associated with activities performed by individuals, in cooperation and collaboration, and by the entire organization. Activities range from personal adaptation and knowledge sharing to collaborative knowledge creation, problem solving, and process improvement; moral virtues include integrity, truth, trustworthiness, and openness; intellectual virtues include agility, creativity, imagination, and diversity of minds.]
Here, Davenport emphasizes the virtues of agility (ability to change rapidly), creativity (ability to explore intellectually), and openness (ability to trust
inherently) as necessary conditions for competitive organizational performance
in a rapidly changing world. These virtues are necessities to enable the concept
of organizational revolution introduced in Chapter 1. When organizations must
continuously change even the business model itself, creativity, agility, and openness are required throughout the organization to maintain stability in the presence of such dynamics. Business models may change but virtues and purpose
provide the stability.
4.1.2
Every organization has a structure and flow of knowledge, a knowledge environment or ecology (emphasizing the self-organizing and balancing characteristics of organizational knowledge networks). The overall process of studying and characterizing this environment is referred to as mapping: explicitly representing the network of nodes (competencies) and links (relationships,
knowledge flow paths) within the organization. The fundamental role of KM
organizational analysis is the mapping of knowledge within an existing organization. As a financial audit accounts for tangible assets, the knowledge mapping identifies the intangible tacit assets of the organization. The mapping
process is conducted by a variety of means: passive observation (where the analyst works within the community), active interviewing, formal questionnaires,
and analysis. In this ethnographic research activity, the mapping analyst seeks
to understand the unspoken, informal flows and sources of knowledge in the
day-to-day operations of the organization. The five stages of mapping (Figure
4.1) must be conducted in partnership with the owners, users, and KM
implementers.
The first phase is the definition of the formal organization chart: the formal flows of authority, command, reports, intranet collaboration, and information systems reporting. In this phase, the boundaries, or focus, of the mapping interest are established. The second phase audits (identifies, enumerates, and
quantifies as appropriate) the following characteristics of the organization:
[Figure 4.1: The five stages of knowledge mapping, conducted by interviewing, participating, and inspecting documents: (1) Define: identify the high-level organizational structure and areas of study (boundaries, areas of focus). (2) Audit: study and analyze the people and products to find flow paths, repositories, bottlenecks, competencies, and communities. (3) Organize: catalog the data to allow analysis (knowledge objects, sources, sinks, categories, values and utility, anomalies). (4) Map: identify the knowledge structure (clusters, critical paths, repositories, formal and informal paths) and find the gaps. (5) Benchmark and plan: benchmark with others and plan the transition changes (cultural, communities, flow paths, processes, training).]
1. Knowledge sources: the people and systems that produce and articulate knowledge in the form of conversation, developed skills, reports, implemented (but perhaps not documented) processes, and databases.
2. Knowledge flow paths: the flows of knowledge, tacit and explicit, formal and informal. These paths can be identified by analyzing the transactions between people and systems; the participants in the transactions provide insight into the organizational network structure by which knowledge is created, stored, and applied. The analysis must distinguish between seekers and providers of knowledge and their relationships (e.g., trust, shared understanding, or cultural compatibility) and mutual benefits in the transaction.
3. Boundaries and constraints: the boundaries and barriers that control, guide, or constrict the creation and flow of knowledge. These may include cultural, political (policy), personal, or electronic system characteristics or incompatibilities.
4. Knowledge repositories: the means of maintaining organizational knowledge, including tacit repositories (e.g., communities of experts that share experience about a common practice) and explicit storage (e.g., legacy hardcopy reports in library holdings, databases, or data warehouses).
Once audited, the audit data is organized in the third phase by clustering
the categories of knowledge, nodes (sources and sinks), and links unique to the
organization. The structure of this organization, usually a table or a spreadsheet,
provides insight into the categories of knowledge, transactions, and flow paths;
it provides a format to review with organization members to convey initial
results, make corrections, and refine the audit. This phase also provides the
foundation for quantifying the intellectual capital of the organization, and the
audit categories should follow the categories of the intellectual capital accounting method adopted (e.g., balanced scorecard as described in Chapter 3). The
process of identifying the categories of knowledge used by the organization
develops a taxonomy (structure) of the knowledge required to operate the
organization. The taxonomy, like the Dewey decimal system in a library, forms
the structure for indexing the organization's knowledge base, creating directories
of explicit knowledge and organizational tacit expertise.
The fourth phase, mapping, transforms the organized data into a structure
(often, but not necessarily, graphical) that explicitly identifies the current
knowledge network. Explicit and tacit knowledge flows and repositories are distinguished, as well as the social networks that support them. This process of
visualizing the structure may also identify clusters of expertise, gaps in the flows,
chokepoints, as well as areas of best (and worst) practices within the network.
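A minimal sketch (with hypothetical people and flow paths) of the kind of structure the mapping phase produces: the knowledge network held as a simple list of directed flows, with heavily used nodes flagged as candidate chokepoints or critical repositories.

```python
# Hypothetical knowledge network (illustration only): directed links represent
# observed knowledge flows from a provider to a seeker.
flows = [
    ("SIGINT analyst A", "all-source analyst C"),
    ("IMINT analyst B", "all-source analyst C"),
    ("all-source analyst C", "team leader D"),
    ("all-source analyst C", "collection manager E"),
    ("lessons-learned database", "all-source analyst C"),
]

# Count how many flows touch each node; nodes that many flows pass through are
# candidate chokepoints (or clusters of expertise) in the knowledge map.
traffic = {}
for provider, seeker in flows:
    traffic[provider] = traffic.get(provider, 0) + 1
    traffic[seeker] = traffic.get(seeker, 0) + 1

for node, count in sorted(traffic.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {count} flow(s)")
```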
Once the organizations current structure is understood, the structure can
be compared to similar structures in other organizations by benchmarking in the
final phase. Benchmarking is the process of identifying, learning, and adapting
outstanding practices and processes from any organization, anywhere in the
world, to help an organization improve its performance. Benchmarking gathers
the tacit knowledge (the know-how, judgments, and enablers) that explicit knowledge often misses [11]. This process allows quantitative performance data and qualitative best-practice knowledge to be shared and compared with similar organizations to explore areas for potential improvement and potential risks. It also allows the organization to plan cultural, infrastructure, and technology changes to improve the effectiveness of the knowledge network, and to compare plans with the prior experience
of other organizations implementing change.
The U.S. CIA has performed a mapping analysis to support KM enterprise
automation initiatives, as well as to satisfy its legal requirements to store records
of operations and analyses [12]. The agency implemented a metadata repository
that indexes abstracted metadata (e.g., author, subject per an intelligence taxonomy, date, and security level) for all holdings (both hardcopy and softcopy)
across a wide variety of databases and library holdings, all at multiple levels of
security access. The repository allows intelligence officers to search for holdings
on topics of interest, but a multiple-level security feature limits their search
reporting to those holdings to which they have access. Because the repository
provides only abstracts, access may still be controlled by the originator (i.e., originators of a document maintain security control and may choose to grant access to
a complete document on an individual need-to-know basis when requested). The
single repository indexes holdings from multiple databases; metadata is automatically generated by tools that read existing and newly created documents. Because
the repository provides a pointer to the originating authors, it also provides critical pointers to people, or a directory that identifies people within the agency with
experience and expertise by subject (e.g., country, intelligence target, or analytic
methods). In this sense, the repository indexes tacit knowledge (the people with
expertise) as well as explicit knowledge (the documents).
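A minimal sketch of such a repository, with hypothetical record fields and clearance levels (the field names are assumptions, not the agency's actual schema): abstracted metadata is indexed for search, results are filtered by the searcher's access level, and full-document access remains with the originator.

```python
from dataclasses import dataclass

# Hypothetical metadata record (illustration only).
@dataclass
class MetadataRecord:
    title: str
    author: str    # pointer to the person with expertise on the subject
    subject: str   # subject per an intelligence taxonomy
    date: str
    level: int     # security level: higher value = more restricted

HOLDINGS = [
    MetadataRecord("Port activity summary", "J. Smith", "maritime trade", "2002-11-03", 1),
    MetadataRecord("Leadership intentions estimate", "R. Jones", "country X leadership", "2003-01-15", 3),
]

def search(subject_term: str, user_level: int):
    """Return abstracts the user is cleared to see; access to complete documents
    is still granted by the originator on a need-to-know basis."""
    return [r for r in HOLDINGS
            if subject_term.lower() in r.subject.lower() and r.level <= user_level]

for record in search("leadership", user_level=2):
    print(record.title, "-", record.author)   # prints nothing: record level 3 > user level 2
```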
4.1.3

[Table: Types of organizational communities, with descriptions and intelligence examples: the entire organization, functional workgroups, cross-functional project teams, communities of practice, and informal communities.]
of Xerox PARC, have indicated that once these activities are unified in communities of practice, they have the potential to significantly enhance knowledge
transfer and creation [14].
Finally, organizational knowledge mappings generally recognize the existence of informal (or underground) communities of knowledge sharing among
friends, career colleagues, and business acquaintances. These informal paths
extend outside the organization and can be transient, yet provide valuable
sources of insight (because they often provide radically different perspectives)
and threatening sources of risk (due to knowledge leakage). Security plans and
knowledge mappings must consider these informal paths.
4.1.4
Initiating KM Projects
The knowledge mapping and benchmarking process must precede implementation of KM initiatives, forming the understanding of current competencies and
processes and the baseline for measuring any benefits of change. KM implementation plans within intelligence organizations generally consider four components, framed by the kind of knowledge being addressed and the areas of
investment in KM initiatives (Figure 4.2) [15]:
1. Organizational competencies. The first area includes assessment of
workforce competencies and forms the basis of an intellectual capital
audit of human capital. This area also includes the capture of best
[Figure 4.2: Four components of KM implementation, framed by knowledge type (explicit or tacit) and area of KM initiative investment (organization: people and processes; infrastructure: technology and processes). Competencies: education and training, best practice and business process capture, human capital analysis. KM network: KM intranets, collaborative groupware, analytic tools, best practice databases, knowledge directories. Social collaboration: communities of practice enhancement and formation, knowledge creation socialization events, human capital analysis. Virtual collaboration: virtual teams and communities of practice, videoconference and collaborative groupware.]
The KM community has recognized the strength of narrative communication (dialogue and storytelling) to communicate the values, emotion (feelings, passion), and sense of immersed experience that make up personalized, tacit
knowledge. Such tacit communication is essential in the cultural transformation
necessary for collaborative knowledge sharing (the socialization process of the
knowledge spiral introduced in Chapter 3) and organizational process innovation. The introduction of KM initiatives can bring significant organizational
change because it may require cultural transitions in several areas:
Changes in purpose, values, and collaborative virtues;
Construction of new social networks of trust and communication;
Organizational structure changes (networks replace hierarchies);
Business process agility, resulting in a new culture of continual change
while storytelling refers to the more formal method of creating a story to convey
tacit knowledge to a large group. Both methods employ a narrative mode of
thought and highly interactive knowledge constructs to communicate to active
recipients, in contrast with the abstract, analytical mode used by logicians and
mathematicians to communicate explicit knowledge (Table 4.4).
Storytelling provides a complement to abstract, analytical thinking and
communication, allowing humans to share experience, insight, and issues (e.g.,
unarticulated concerns about evidence expressed as negative feelings, or general
impressions about repeated events not yet explicitly defined as threat patterns).
KM consultant Stephen Denning, who teaches the methodology for creation of
story narratives, describes the complementary nature of these modes:
Storytelling doesn't replace analytical thinking. It supplements it by enabling us to imagine new perspectives and new worlds, and is ideally suited
[Table 4.4: Analytical and storytelling modes of thought and communication. The abstract, analytical mode communicates explicit knowledge through formal constructs (e.g., y = mx + b) to a passive spectator; the narrative, storytelling mode communicates tacit knowledge through interactive constructs to an active participant.]
The organic school of KM that applies storytelling to cultural transformation emphasizes a human behavioral approach to organizational socialization,
accepting the organization as a complex ecology that may be changed in a large
way by small effects. These effects include the use of a powerful, effective story
that communicates in a way that spreads credible tacit knowledge across the
entire organization [17]. This school classifies tacit knowledge into artifacts,
skills, heuristics, experience, and natural talents (the so-called ASHEN classification of tacit knowledge) and categorizes an organization's tacit knowledge in
these classes to understand the flow within informal communities.
The organic approach to organizational culture change (Figure 4.3) first
understands the organizational tacit knowledge by capturing anecdotal stories
that describe the operation, experiences, and implicit values of the organization.
From the oral repository of tacit knowledge in these stories, the KM researcher
maps the culture, describing high-level patterns of behavior in archetypes and stories (e.g., exemplar people and roles in the organization and representative knowledge transaction processes, both good and bad). From the analysis of these
archetypal stories, the KM researcher can synthesize or select real stories that communicate desired organizational values and create cross-cultural understanding.
This organic research approach has been developed and applied by KM
consultant and researcher Dave Snowden, who has immersed himself in organizations to observe the flow of knowledge among communities of practice.
Snowden has noted:
One of the paradoxes is that informal communities are the real dynamos of
knowledge. If you build strong boundaries between formal and informal
communities, you get increased knowledge flows. But if you try to break
the boundaries down, the informal knowledge goes offsite because people don't feel secure [18].

[Figure 4.3: The organic approach to organizational culture change. Analysis: capturing field anecdotes (narratives) into an oral history knowledge repository and mapping the knowledge to assess the culture by archetypes and story. Synthesis: story management to guide the culture, learning via the metaphor and the archetype story, and cross-cultural understanding and integration.]
Snowden's studies revealed this need for a delicate balance between sharing and protection of informal and formal information flows. Attempts to put
everything on-line may bring organizational insecurity (and reluctance to
share). Nurturing informal sharing within secure communities of practice and
distinguishing such sharing from formal sharing (e.g., shared data, best practices, or eLearning) enables the rich exchange of tacit knowledge when creative
ideas are fragile and emergent.
toward the end of personal and organizational growth. The desire to learn must be to seek a clarification of one's personal vision and role within
the organization.
Systems thinking. Senge emphasized holistic thinking, the approach for
understands the diversity of its makeup and adapts its behaviors, patterns of interaction, and dialogue to enable growth in personal and
organizational knowledge.
It is important, here, to distinguish the kind of transformational learning
that Senge was referring to (which brings cultural change across an entire
organization), from the smaller scale group learning that takes place when an
intelligence team or cell conducts a long-term study or must rapidly get up to
speed on a new subject or crisis. Large-scale organizationwide transformational
learning addresses the long-term culture-changing efforts to move whole organizations toward collaborative, sharing cultures. Group learning and Senge's personal mastery, on the other hand, include the profound and rapid growth of
subject matter knowledge (intelligence) that can occur when a diverse intelligence team collaborates to study an intelligence target. In the next subsections,
we address the primary learning methods that contribute to both.
4.2.1 Defining and Measuring Learning
The process of group learning and personal mastery requires the development of
both reasoning and emotional skills. The level of learning achievement can be
assessed by the degree to which those skills have been acquired. Researcher Benjamin Bloom and a team of educators have defined a widely used taxonomy of
the domains of human learning: knowledge, attitude, and skills (KAS) [20].
These three areas represent the cognitive (or mental skills), affective (attitude or
emotional skills), and psychomotor (manual or physical movement) domains of
human learning.
The goal of organizational learning is the development of maturity at the organizational level: a measure of the state of an organization's knowledge about its
domain of operations and its ability to continuously apply that knowledge to
increase corporate value to achieve business goals.
The Carnegie Mellon University Software Engineering Institute has defined a five-level People Capability Maturity Model (P-CMM) that distinguishes
five levels of organizational maturity, which can be measured to assess and
[Table 4.5: Cognitive (explicit knowing) learning domain skills applied to intelligence, pairing mental reasoning skills with foreign threat analysis examples: comprehending (interpreting problems, translating and relating data to information, assigning meaning); synthesizing (constructing concepts from components and assigning new meanings; e.g., synthesizing models of national leadership intention formation, planning, and decision making); and evaluating (making judgments about the values of concepts for decision making; e.g., evaluating models and hypotheses, comparing and adapting models as time progresses to assess the utility of models and competing hypotheses).]

[Table: P-CMM organizational maturity levels, including initial, managed, defined, and optimized.]
Learning Modes
We gain experience by informal modes of learning on the job: alone, with mentors, with team members, or while mentoring others. The methods of informal learning are as broad as the methods of exchanging knowledge introduced in the last
chapter. But the essence of the learning organization is the ability to translate
what has been learned into changed organizational behavior. David Garvin has
identified five fundamental organizational methodologies that are essential to
implementing the feedback from learning to change; all have direct application
in an intelligence organization [22].
1. Systematic problem solving. Organizations require a clearly defined
methodology for describing and solving problems, and then for implementing the solutions across the organization. Methods for acquiring
and analyzing data, synthesizing hypotheses, and testing new ideas
must be understood by all to permit collaborative problem solving.
(These methods are described in Section 4.4 of this chapter.) The
process must also allow for the communication of lessons learned and
best practices developed (the intelligence tradecraft) across the
organization.
2. Experimentation. As the external environment changes, the organization must be enabled to explore changes in the intelligence process.
This is done by conducting experiments that take excursions from the
normal processes to attack new problems and evaluate alternative tools
and methods, data sources, or technologies. A formal policy to encourage experimentation, with the acknowledgment that some experiments
will fail, allows new ideas to be tested, adapted, and adopted in the
normal course of business, not as special exceptions. Experimentation
can be performed within ongoing programs (e.g., use of new analytic
tools by an intelligence cell) or in demonstration programs dedicated
to exploring entirely new ways of conducting analysis (e.g., the creation of a dedicated Web-based pilot project independent of normal
operations and dedicated to a particular intelligence subject domain).
3. Internal experience. As collaborating teams solve a diversity of intelligence problems, experimenting with new sources and methods, the
lessons that are learned must be exchanged and applied across the
organization. This process of explicitly codifying lessons learned and
making them widely available for others to adopt seems trivial, but in
practice requires significant organizational discipline. One of the great
values of communities of common practice is their informal exchange
of lessons learned; organizations need such communities and must
support formal methods that reach beyond these communities. Learning organizations take the time to elicit the lessons from project teams
and explicitly record (index and store) them for access and application
across the organization. Such databases allow users to locate teams
with similar problems and lessons learned from experimentation, such
as approaches that succeeded and failed, expected performance levels,
and best data sources and methods.
4. External sources of comparison. While the lessons learned just described
applied to self-learning, intelligence organizations must look to external sources (in the commercial world, academia, and other cooperating
intelligence organizations) to gain different perspectives and experiences not possible within their own organizations. A wide variety of
methods can be employed to secure the knowledge from external perspectives, such as making acquisitions (in the business world), establishing strategic relationships, using consultants, and establishing consortia. The process of sharing, then critically comparing, qualitative
and quantitative data about processes and performance across organizations (or units within a large organization), enables leaders and
process owners to objectively review the relative effectiveness of alternative approaches. Benchmarking is the process of improving performance by continuously identifying, understanding, and adapting
outstanding practices and processes found inside and outside the
organization [23]. The benchmarking process is an analytic process
that requires compared processes to be modeled, quantitatively measured, deeply understood, and objectively evaluated. The insight gained
is an understanding of how best performance is achieved; the knowledge is then leveraged to predict the impact of improvements on overall organizational performance.
5. Transferring knowledge. Finally, an intelligence organization must
develop the means to transfer people (tacit transfer of skills,
In addition to informal learning, formal modes provide the classical introduction to subject-matter knowledge. For centuries, formal learning has focused on
a traditional classroom model that formalizes the roles of instructor and student
and formalizes the learning process in terms of courses of study defined by a syllabus and learning completion defined by testing criteria. The advent of electronic storage and communication has introduced additional formal learning
processes that allow the process to transcend space-time limitations of the traditional classroom. Throughout the 1980s and 1990s, video, communication, and
networking technologies have enabled the capture, enhancement, and distribution of canned and interactive instructional material. These permit wider distribution of instructional material while enriching the instruction with student
interaction (rather than passive listening to lectures). Information technologies
have enabled four distinct learning modes that are defined by distinguishing
both the time and space of interaction between the learner and the instructor
(Figure 4.4) [25]:
1. Residential learning (RL). Traditional residential learning places the
students and instructor in the physical classroom at the same time and
place. This proximity allows direct interaction between the student
and instructor and allows the instructor to tailor the material to the
students.
2. Distance learning remote (DL-remote). Remote distance learning provides live transmission of the instruction to multiple, distributed locations. The mode effectively extends the classroom across space to reach
a wider student audience. Two-way audio and video can permit limited interaction between extended classrooms and the instructor.
While RL and DL-remote synchronize instruction and learning at the same time, the next two modes are asynchronous, allowing learning to occur at a time and place separate from the instructor's presentation.
[Figure 4.4: Four learning modes defined by instructor-student time and place (same or different): (1) residential learning; (2) distance learning (DL), remote classroom; (3) DL, canned; (4) DL, collaboration. The first two are synchronous learning modes; the latter two are asynchronous.]
3. Distance learning canned (DL-canned). This mode simply packages (or
cans) the instruction in some media for later presentation at the student's convenience (e.g., traditional hardcopy texts, recorded audio or video, or softcopy materials on compact discs). DL-canned materials include computer-based training courseware that has built-in features to interact with the student to test comprehension, adaptively present material to meet a student's learning style, and link to supplementary materials on the Internet.
4. Distance learning collaborative (DL-collaborative). The collaborative
mode of learning (often described as e-learning) integrates canned
material while allowing on-line asynchronous interaction between the
student and the instructor (e.g., via e-mail, chat, or videoconference).
Collaboration may also occur between the student and software
agents (personal coaches) that monitor progress, offer feedback, and
recommend effective paths to on-line knowledge.
This process of collaboration requires a team (two or more) of individuals that shares a common purpose, enjoys mutual respect and trust, and has an
established process to allow the collaboration process to take place. Four levels
(or degrees) of intelligence collaboration can be distinguished, moving toward
increasing degrees of interaction and dependence among team members
(Table 4.7) [27].
The process of collaboration can occur within a small team of colocated
individuals assigned to a crisis team or across a broad community of intelligence
planners, collection managers, analysts, and operations personnel distributed
across the globe. Collaborative teams can be short lived (e.g., project and crisis
teams) or long term (e.g., communities of common intellectual practice). The
means of achieving collaboration across this wide range of teams all require the
creation of a collaborative culture and the establishment of an appropriate environment and workflow (or collaborative business processes). Sociologists have
studied the sequence of collaborative groups as they move from inception to
decision commitment. Decision emergence theory (DET) defines four stages of
collaborative decision making within an individual group: orientation of
all members to a common perspective; conflict, during which alternatives are
compared and competed; emergence of collaborative alternatives; and finally
[Table 4.7: Levels of cooperative behavior, in increasing degree of interaction and dependence among team members: (1) awareness, (2) coordination, (3) active sharing, and (4) joint activities.]

4.3.1 Collaborative Culture
[Table: Personality-type dimensions (e.g., the approach to perceiving a situation and processing information; the approach to life organization and interface to the outer world) and their influence on collaborative intelligence analysis: collaboration must allow interaction between those who think aloud and those who think alone, and tools must provide for the capture of introverted thinking and extroverted speaking.]
4.3.2 Collaborative Environments
Collaborative environments describe the physical, temporal, and functional setting within which organizations interact. Through the 1990s, traditional physical environments were augmented by (and in some cases replaced by) virtual
environments, which were enabled by computation and communication to
transcend the time and space limitations of physical meeting places. The term
collaborative virtual environment (CVE) represents this new class of spaces,
broadly defined as:
A Collaborative Virtual Environment is a computer-based, distributed, virtual space or set of places. In such places people can meet and interact with
others, with [computer] agents or with virtual objects. CVEs might vary in
their representational richness from 3D graphical spaces, 2.5D and 2D
environments, to text-based environments. Access to CVE is by no means
limited to desktop devices but might well include mobile or wearable
devices, public kiosks, etc. [32].
[Figure 4.5: Collaboration environments organized by time and place of collaboration. Same place: collocated, in-person, face-to-face physical meetings with immediate physical access, supported by electronic meeting support systems. Different place, synchronous collaboration: virtual teams with considerable on-line team communication, interaction, and information sharing via teleconference, videoconference, on-line meetings, instant messaging and on-line chat spaces, and on-line application sharing. Different place, asynchronous collaboration: intermittent on-line access to team members via e-mail, bulletin boards, and group workspaces sharing data (files), calendars, and applications (tools).]
[Table 4.9: Basic groupware functions by collaboration time mode (synchronous, synchronous and asynchronous, asynchronous), including text conference (one-to-one or one-to-many interactive text among multiple participants, i.e., chat rooms), workspaces (virtual rooms), whiteboard graphics tools, file sharing, application sharing, participation administration, security, scheduling, workflow management, group authoring, presence awareness, instant messaging, and bulletin boards.]
Because any large network of diverse contributors may include users with different commercial collaboration products, interoperability between groupware
applications is a paramount consideration. A U.S. interoperability study of synchronous and asynchronous collaboration between commercial products evaluated
the principal combinations of two basic implementation options [27]:
Web-based interoperation, in which all users collaborate via a standard
Internet browser client. (The client may include unique Java applets,
which run on the browser's Java virtual machine, or plug-in
applications.)
Application-based interoperation, in which all users collaborate via
T.120/H.323 standards-based applications (i.e., application-unique software that may be required at the browser, the server, or both).
Of course, the Web-based modes offer greater interoperability with more
limited performance (due to more restrictive and limited standards) than the
dedicated-standards-based products. The study highlighted the difficult trades
between connectivity, flexibility, and platform diversity on one hand and security, technical performance, and collaboration policy control on the other hand.
The study noted that intelligence users expect different physical settings when
moving from agency to agency for physical meetings, and it is reasonable to
expect (and tolerate) a degree of differences in virtual settings (e.g., tool look and
feel while using different collaboration applications) when joining different collaboration environments across the community of users.
Collaborative groupware implementations offer two alternative approaches
to structuring groups:
Centralized collaboration. Groupware based on a central collaboration
server, with distributed software clients, provides the basis for large-
The U.S. IC has identified the specific kinds of collaboration that it seeks to
achieve, especially between consumers and the collection, operations, and analysis disciplines:
We aim to narrow the gap between intelligence collectors and intelligence
consumers. How do we do this? One way is to assign more DI [Directorate
of Intelligence] experts to policy agencies and to negotiating teams. We
now have dozens of DI officers dispersed throughout the policy community, offering policymakers one stop shopping for intelligence analysis.
We have also taken steps, but have a long way to go, to link CIA with
[Figure 4.6: Representative collaborative analysis and production workflow. A decision maker (DM) interacts with the all-source analysis team leader (LDR) and all-source analysts (AA); a collection manager (CM) performs collection coordination management, and single-source analysts (SSA) perform cross-cueing against individual data sources. Numbered paths mark the major knowledge-sharing transactions, including release approval. Supporting elements include planning tools, analysis tools, processing, digital production, a custom intelligence portal, an analysis base, and an intelligence database that maintain the analysis and production workflow.]
imagery, signals, or MASINT) are specialists that analyze the raw data
collected as a result of special tasking; they deliver reports to the all-source team and certify the conclusions of special analysis.
Collection managers. The collection managers translate all-source
The numbered paths in Figure 4.6 represent only the major knowledge-sharing
transactions, described in the following numbered list, which illustrate the collaboration process in this example. At the right side of the figure is the supporting analysis and production system workflow, which maintains the collected
base of raw data, analyzed and annotated information, and analysis results.
1. Problem statement. The State Department decision maker (DM), the
ultimate intelligence consumer, defines the problem. Interacting with
the all-source analytic leader (LDR) and all-source analysts on the analytic team, the problem is articulated in terms of scope (e.g., area of the world, focus nations, and expected depth and accuracy of estimates), needs (e.g., specific questions that must be answered and policy issues), urgency (e.g., time to first results and final products), and
expected format of results (e.g., product as emergent results portal or
softcopy document).
2. Problem refinement. The analytic leader (LDR) frames the problem
with an explicit description of the consumer requirements and intelligence reporting needs. This description, once approved by the consumer, forms the terms of reference for the activity. The problem
statement-refinement loop may be iterated as the situation changes or
as intelligence reveals new issues to be studied.
3. Information requests to collection tasking. Based on the requirements, the analytic team decomposes the problem to deduce the specific elements of information needed to model and understand the level of trafficking. (The decomposition process was described earlier in Section 2.4.) The LDR provides these intelligence data requirements to the collection manager (CM) to prepare a collection plan. This planning requires the translation of information needs into a coordinated set of data-collection tasks for humans and technical collection systems. The CM prepares a collection plan that traces planned collection data and means to the analytic team's information requirements.
4. Collection refinement. The collection plan is fed back to the LDR to allow the analytic team to verify the completeness and sufficiency of the plan, and to allow a review of any constraints (e.g., limits to coverage, depth, or specificity) or the availability of previously collected relevant data. The information request-collection planning and
duced in the classic text The Rational Manager [38] and taught to generations of managers in seminars, has been applied to management,
engineering, and intelligence-problem domains. This method carefully
distinguishes problem analysis (specifying deviations from expectations,
hypothesizing causes, and testing for probable causes) and decision
analysis (establishing and classifying decision objectives, generating
alternative decisions, and comparing consequences).
Multiattribute utility analysis (MAUA). This structured approach to
[Table 4.10: Basic Elements of Critical Thinking Methodology, organized by problem-solving stages; the fourth stage is judgment, used to evaluate arguments and make decisions.]
[Figure: Decision-analysis flow. Problem assessment structures the problem and quantifies the decision (solution) criteria; problem decomposition conceives solution options, identifies factors, and synthesizes a model; decision analysis analyzes the model, factors, and effects and measures cost, risk, and performance against the evaluation metrics; alternative analysis and solution evaluation quantify uncertainty and preferences to arrive at the solution.]
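To make the weighted evaluation of alternatives concrete, the brief sketch below shows one way a multiattribute utility calculation might be implemented; it is a minimal illustration rather than a prescribed method, and the alternative names, attributes, weights, and scores are all hypothetical.

# Minimal multiattribute utility analysis (MAUA) sketch.
# Hypothetical alternatives scored on hypothetical attributes (0.0-1.0 utility scale).
weights = {"cost": 0.3, "risk": 0.3, "performance": 0.4}   # assumed criterion weights

alternatives = {
    "Option A": {"cost": 0.8, "risk": 0.6, "performance": 0.7},
    "Option B": {"cost": 0.5, "risk": 0.9, "performance": 0.6},
    "Option C": {"cost": 0.7, "risk": 0.7, "performance": 0.9},
}

def utility(scores, weights):
    # Weighted sum of single-attribute utilities.
    return sum(weights[a] * scores[a] for a in weights)

ranked = sorted(alternatives.items(), key=lambda kv: utility(kv[1], weights), reverse=True)
for name, scores in ranked:
    print(f"{name}: utility = {utility(scores, weights):.2f}")

Ranking alternatives by such a weighted utility mirrors the decision-analysis step of measuring cost, risk, and performance against quantified criteria.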
Systems Thinking
[Table: Comparison of the analytic thinking and systems thinking approaches by the basis of each approach. The basis of analytic thinking is reductionism: decompose the system into component parts and relationships, study the behaviors of the components, and then integrate to explain large-scale system behavior.]
The central distinctions in these approaches are in the degree of complexity of the system being studied, defined by two characteristics:
1. Number of independent causes;
2. Number of relationships between those causes.
Complexity refers to the property of real-world systems that prohibits any formalism from completely representing or describing their behavior. In contrast with simple systems that may be fully described by some formalism (i.e., mathematical equations that fully describe a real-world process to some level of satisfaction for the problem at hand), complex systems lack a fully descriptive formalism that captures all of their properties, especially global behavior. Dynamic systems are described variously; the most common terms include:
Simple. These are dynamic systems whose local and global behavioral
decomposed) systems, which lack a fully descriptive formalism that captures all of their properties, especially global behavior. These systems may be linear and deterministic.
Chaotic. Chaotic systems are nonlinear deterministic systems that only
appear to be random and are characterized by sensitivity to initial conditions and instability.
Random. Random systems are maximally complex.
Reductionist approaches have fared well in explicitly modeling simple systems down to the molecular level in classical physics (e.g., Newton's laws), in predicting planetary motion, in describing fundamental processes in chemistry, and in the engineering description and simulation of highly complicated electronic and mechanical systems. But systems of subatomic scale, human organizational systems, and large-scale economies, where very large numbers of independent causes interact in large numbers of interactive ways, are characterized by an inability to model global behavior and a frustrating inability to predict future behavior. These systems are best understood by a systems-thinking approach that focuses on the study of these high-level emergent behavior patterns, rather than attempting to decompose to infinite detail. We often speak of a subject matter expert's broad experience, deep insight, or wisdom to refer to the unspoken or unarticulated (tacit) knowledge that is applied, and often not fully justified, in the analysis of a complex problem. The expert's judgment is based not on an external and explicit decomposition of the problem, but on an internal matching of high-level patterns of prior experience with the current situation. The experienced detective, as well as the experienced analyst, applies such high-level comparisons of current behaviors with previous tacit (unarticulated, even unconscious) patterns gained through experience.
Of interest to the intelligence analyst is the value of holistic study of such complexity to gain insight into the evaluation of complex situations and into the study of strategic surprise. John Casti, in Complexification: Explaining a Paradoxical World through the Science of Surprise, has suggested that complexity science offers insight into the general causes of surprise: when our pictures of reality depart from reality itself [44]. The use of agent-based simulation tools to create and study emergent behaviors in support of intelligence analysis is described in Chapter 8. Such tools effectively enhance the experience of the analyst by simulating the interaction of many agents (independent but highly interactive causes) over a wide range of conditions to explore the complex space of
It is important to recognize that humans often make decisions much less formally than by the methods just described. Indeed, in times of crisis, when time does not permit the careful methodologies, humans apply more naturalistic methods that, like the systems-thinking mode, rely entirely on the only basis available: prior experience.
The U.S. Commander of Operation Noble Anvil in the 1999 conflict in Kosovo commented candidly on the effects of IT applications in his stressful command and control environment: "Great technology, but needs controls." Admiral James O. Ellis (CINC USN Europe) went on to note that information saturation adds to the fog of war and that the demand for information will always exceed the capability to provide it, but asked the question, how much is enough? The admiral clearly recognized the potential for critical information, but acknowledged that his KM systems were flooding him with more information than his people could apply: "Uncontrolled, [information] will control you and your staffs and lengthen your decision-cycle times." (Insightfully, the admiral also noted, "You can only manage from your Desktop Computer ... you cannot lead from it" [45].)
Studies of such naturalistic decision-making environments have sought to
develop aids to cognition for both rapid, comprehensive situation awareness and
decision support. Kline et al. [46] have characterized the general approaches that
humans apply to such tasks:
Decision makers experience the situation (holistically) and match the
the focus of energy is placed on elaborating and refining the best immediate approach (often the first imagined option) rather than on creating
a diverse set of options.
Mental simulation is applied, based on the decision maker's prior experience, where the decision maker imagines or envisions the likely outcomes of actions.
The focus of attention is placed more on assessing the situation and act-
analytic tradecraft were collected in technical notes used for training. A Compendium of Analytic Tradecraft Notes was published in 1996 as a text of analytic best
Table 4.12
An Analytic Tradecraft Taxonomy
Addressing national interests: methods to view the analytic problem from the policymaker's (consumer's) perspective.
Identifying analysis value-added contribution: criteria to know policy issues and provide options without making policy recommendations; tests to be applied to draft reports (the so-what and action-support tests).
Further areas: access and credibility, articulation of assumptions, outlook, analytic expertise, effective summary, implementation analysis, and counterintelligence.
practices [47]. These best practices summarize the characteristics of best analyses, providing supporting examples as appropriate. The Tradecraft Notes provide
a taxonomy of basic analytic practices; Table 4.12 enumerates the initial 10
declassified categories, illustrating key best-practice areas within the Directorate
of Intelligence. The CIA maintains a product evaluation staff to evaluate intelligence products, to learn from the large range of products produced (estimates, forecasts, technical assessments, threat assessments, and warnings), and to maintain the database of best practices for training and distribution to the analytic staff.
4.6 Summary
In this chapter, we have introduced the fundamental cultural qualities, in terms of virtues and disciplines, that characterize the knowledge-based intelligence organization. The emphasis has necessarily been on the organizational disciplines (learning, collaborating, and problem solving) that provide the agility to deliver accurate and timely intelligence products in a changing environment. The virtues and disciplines require support: technology to support collaboration over time and space, to support the capture and retrieval of explicit knowledge, to enable the exchange of tacit knowledge, and to support the cognitive processes in analytic and holistic problem solving.
In subsequent chapters, we will detail these supporting technologies and
their application in the KM environment, but first we examine the core of the
learning organization's knowledge-creating process: the analysis of intelligence
data and the synthesis of intelligence products. In the next two chapters, we
describe these core knowledge-creating processes and their implications for
implementation in the KM environment. In subsequent chapters, we will introduce the tools, technology, and enterprise infrastructure necessary to support
the intensely human analysis-synthesis processes.
Endnotes
[1]
See, for example, Stewart, T. A., The Wealth of Knowledge: Intellectual Capital and the 21st
Century Organization, New York: Currency-Doubleday, 2002; Dixon, N. M., Common
Knowledge: How Companies Thrive by Sharing What They Know, Boston: Harvard Business
School Press, 2000; O'Dell, C., et al., If Only We Knew What We Know: The Transfer of
Internal Knowledge and Best Practice, New York: Free Press, 1998; Garvin, D. A., Learning
in Action: A Guide to Putting the Learning Organization to Work, Boston: Harvard Business
School Press, 2000.
[2]
The DCI Strategic Intent was published in March 1999. The five major objectives of the
Intent and activities in each area were published in the unclassified Annual Report for the
[3] Gannon, J. C., Strategic Use of Open Source Intelligence addressed to the Washington
College of Law at American University, Washington, D.C., October 6, 2000.
[4] Leonard-Barton, D., Wellsprings of Knowledge, Boston: Harvard Business School Press,
1995, pp. 24-26, 51-53.
[5] See Virtue Epistemology in Stanford Encyclopedia of Philosophy, accessed on-line on February 18, 2001 at http://plato.stanford.edu/entries/epistemology-virtue.
[6] Winograd, T., and F. Flores, Understanding Computers and Cognition, New York: Addison
Wesley, 1995.
[7] Wisdom refers to the ability to apply knowledge to select the best ends and to choose the
best means to accomplish those ends.
[8] Janis, I., Victims of Groupthink: Psychological Study of Foreign-Policy Decisions and Fiascoes,
(2nd ed.), Boston: Houghton Mifflin, 1972.
[9] Davenport, T. and Prusak, L., Working Knowledge: How Organizations Manage What They
Know, Boston: Harvard Business School Press, 1998.
[10] Quoted in CIO Magazine, August 15, 2001, p. 118.
[11] Definition by the APQC, www.apqc.org.
[12] Varon, E., The Langley Files, CIO, August 1, 2000, pp. 126-130.
[13] Karlenzig, W., Senge on Knowledge, Knowledge Management, July 1999, pp. 22-23.
[14] Brown, J. S., and P. Duguid, The Social Life of Information, Cambridge, MA: Harvard
Business Review, 2000.
[15] This figure is adapted from Earl, M., and I. Scott, What Is a Chief Knowledge Officer?
Sloan Management Review, Winter 1999, pp. 29-38.
[16] Denning, S., The Springboard: How Storytelling Ignites Action in Knowledge-Era Organizations, Boston: Butterworth Heinemann, 2001, pp. xvii-xviii.
[17] The organic school complements the mechanistic school, which approaches organizations
as rational models subject to quantitative analysis and scientific management.
[18] Barth, S., The Organic Approach to the Organization: A Conversation with KM Practitioner David Snowdon, Knowledge Management, October 2000, p. 24.
[19] Senge, P., The Fifth Discipline, New York: Doubleday, 1990, p. 3. See also Senges more
recent book, The Dance of Change: The Challenges to Sustaining Change in Learning
Organizations, New York: Currency-Doubleday, 1999.
[20] Bloom, B. S., B. B. Mesia, and D. R. Krathwohl, Taxonomy of Educational Objectives, Volumes 1 and 2, New York: David McKay, 1964.
[21] Curtis, B., W. Hefley, and S. Miller, People Capability Maturity Model (P-CMM),
Version 2.0, CMU/SEI 2001-MM-01, Carnegie-Mellon Software Engineering Institute,
July 2001.
[22] Garvin, D., Building a Learning Organization, in HBR on Knowledge Management, Boston: HBR Press, 1998, pp. 4780.
[23] Definition adopted by the APQC. See Benchmarking: Leveraging Best-Practice Strategies, APQC, 1995. Accessed on-line on March 8, 2002, at http://apqc.org/free/whitepapers/dispWhitePaper.cfm?Product ID=663.
[24] Davenport, T. and Prusak, L., Working Knowledge: How Organizations Manage What They
Know, Chapter 5, Knowledge Transfer, Boston: Harvard Business School Press, 1998,
pp. 88-106.
[25] The terminology here is adopted from Distance Learning, Army Science Board, 1997 Summer Study Final Report, December 1997, pp. 6-7, 16.
[26] This is a question of the degree to which groups of individuals can think collectively and
coherently. The fact and efficiency of group communication, interaction, influence, and
coordination is not in question; the existence or meaning of group cognition (groupthink)
is the issue in question. See the divergent views in Smith, J. B., Collective Intelligence in
Computer-Based Collaboration, Mahwah, NJ: Lawrence Erlbaum Assoc., 1994; and Newell, A., Unified Theories of Cognition, Cambridge, MA: Harvard University Press, 1990.
[27] Intelligence Community Collaboration, Baseline Study Final Report, MITRE, December
1999, Section 3.2.2, accessed on-line on April 20, 2002, at http://collaboration
.mitre.org/prail.
[28] Fisher, B. A., and D. Ellis, Small Group Communication (3rd ed.), New York: McGraw-Hill, 1990.
[29] By mindset, we refer to the distillation of the intelligence analyst's cumulative factual and conceptual knowledge into a framework for making estimative judgments on a complex subject. Mindset includes a commitment to a reference viewpoint on a subject; creating a mindset is a vital and indispensable element of human reasoning but introduces a bias against contradictory evidence or competing mindsets. See Davis, J., Combating Mind-Set, Studies in Intelligence, Vol. 36, No. 5, 1992.
[30] The Myers-Briggs Type Indicator is a psychological instrument to characterize an individual's personality type based on a generalization of Carl Jung's (1875-1961) psychology of personality types. The MBTI is a questionnaire trademarked and copyrighted by Consulting Psychological Press (1962) that may be used to define a Myers-Briggs type of an individual. The Keirsey temperament model (1978) is a similar classification scheme that distinguishes more subtle features of temperament (interests, orientation, values, self-image, and social roles).
[31] See, for example, the analysis of a collaborating group in Leonard, D., and S. Straus, Putting Your Company's Whole Brain to Work, in HBS on Knowledge Management, Boston: HBR Press, 1998, pp. 135-136.
[32] Churchill, E., D. Snowdon, and A. Munro (Eds.), Collaborative Virtual Environments,
Heidelberg, Germany: Springer Verlag, 2002.
[33] Groove Product Backgrounder White Paper, Groove Networks, Inc., 2001, accessed online on February 15, 2002 at http://www.groove.net/pdf/backgrounder-product.pdf.
[34] Intelligence Analysis for the 21st Century, speech by John C. Gannon, (Former) Deputy Director for Intelligence, CIA, at The Fletcher School of Law and Diplomacy, Tufts
University, November 18, 1997, accessed on-line on January 31, 2002, at http://www
.odci.gov/cia/di/speeches/42826397.html.
[35] Richard, A. O., International Trafficking in Women to the United States: A Contemporary
Manifestation of Slavery and Organized Crime, U.S. Dept of State Bureau of Intelligence
and Research, DCI Center for the Study of Intelligence Monograph, Washington, D.C.:
CIA, November 1999.
[36] We purposely include predominantly convergent (thinking to narrow choices to arrive at a
solution), rather than divergent (exploratory or creative thinking) methodologies of
thought here. Divergent creativity is required to synthesize hypotheses, and lateral creative
thinking methods (e.g., see Edward DeBono's Serious Creativity, 1992) are certainly
required within the processes described in this chapter.
[37] Descartes, R., Discourse on Method, 1637; his four-step problem-solving method of analysis and synthesis is described in Part II, Principal Rules of the Method.
[38] Kepner, C. H., and B. B. Tregoe, The Rational Manager, Princeton, NJ: Kepner-Tregoe,
1965.
[39] Waltz, E., and J. Llinas, Multisensor Data Fusion, Boston: Artech House, 1990,
pp. 419-423.
[40] Sawka, K., Competing Hypothesis Analysis: Building Support for Analytic Findings,
Competitive Intelligence Magazine, Vol. 3, No. 3, July-September 1999.
[41] Lockwood, J., and K. Lockwood, The Lockwood Analytic Method for Prediction
(LAMP), Defense Intelligence Journal, Vol. 3, No. 2, Fall 1994, pp. 47-74.
[42] Decision analysis is an entire discipline of its own, encompassing a broad set of quantitative analysis approaches to arrive at decisions. For an overview of this field, see Watson, S. R., and D. M. Buede, Decision Synthesis, Cambridge, England: Cambridge University
Press, 1987.
[43] Philosophers characterize a number of approaches to thinking about the world and attribute these classics to their most well-known teachers: Isaac Newton (1642-1727) introduced the concept of mechanism that explained causality in the physical world, René
Descartes (1596-1650) introduced reductionism to decompose systems into parts for independent analysis, and Francis Bacon (1561-1626) introduced empiricism, whereby the systems of nature are observed and explanations (hypotheses) synthesized and then tested (the
scientific method).
[44] Casti, J. L., Complexification: Explaining a Paradoxical World Through the Science of Surprise, New York: Harper Collins, 1994, p. 3.
[45] Ellis, J. O., CINC U.S. Naval Forces Europe, Commander Joint Task Force Noble Anvil,
A View From the Top, After Action Briefing, July 4, 1999, accessed on-line on May 25,
2002, at http://www.d-n-i.net/fcs/ppt/ellis_kosovo_aar.ppt.
[46] Kline, G. A., Judith Orasanu, and Roberta Calderwood (eds.), Decision Making in Action,
Norwood, NJ: Ablex, 1993.
[47] CIA Opens Door on the Craft of Analysis, Center For Study of Intelligence Newsletter,
No. 7, Winter-Spring 1997. The CIA Directorate of Intelligence made available to the
public a reprinted and revised edition of A Compendium of Analytic Tradecraft Notes, Volume I (Notes 1-10), which have become a standard reference within CIA for practitioners
and teachers of intelligence analysis. The revised compendium contains 10 Tradecraft Notes
issued to analysts during March-December 1995.
5
Principles of Intelligence Analysis and
Synthesis
A timeless Sidney Harris cartoon depicts a long-haired professor posed in front of a blackboard filled with his grad student's mathematical equations. Neatly scrawled between the board-length equation and the solution is the simple statement, "then a miracle occurs." The skeptical professor admonishes his student, "I think you should be more explicit here in step two." In this chapter, we will explain the "step two" of the KM process: the jump from accumulated information to the creation of knowledge. At the core of all knowledge creation are the seemingly mysterious reasoning processes that proceed from the known to the assertion of entirely new knowledge about the previously unknown. For the intelligence analyst, this is the process by which evidence [1], the data determined to be relevant to a problem, is used to infer knowledge about a subject of investigation: the intelligence target. The process must deal with evidence that is often inadequate, undersampled in time, ambiguous, and of questionable pedigree.
We refer to this knowledge-creating discipline as intelligence analysis and
the practitioner as analyst. But analysis properly includes both the processes of
analysis (breaking things down) and synthesis (building things up). In this chapter, the analytic-synthetic process of reasoning about data to produce knowledge
(intelligence) is introduced from a theoretical and functional point of view. Following this chapter on the principles of analysis-synthesis, we will move on to
discuss the practice with practical implementations and applications in the subsequent chapter.
resulting State Department intelligence report [2] provides insight into how the
analysis-synthesis process might proceed within the intelligence cycle introduced in Chapter 2:
Collection planning (analysis). The analyst examines the preexisting evi-
hypothesis is created (by the inductive analysis process or by pure inspiration) to explain the observed phenomena.
[Figure 5.1: The reasoning processes of analysis and synthesis. Analysis breaks data (evidence) into parts, sorts them, and identifies relationships; synthesis assembles the parts into larger constructs to form solutions.]
[Table 5.1: Kant's categories of propositions, distinguishing analytic and synthetic propositions by the basis of the proposition, with example proposition categories such as the synthetic a priori (geometry, mathematics) and the analytic a priori (fundamental logic derived by deduction from a priori propositions).]
Of course the intelligence analyst, like the scientist, applies both analysis
and synthesis in a complementary fashion to study the components of evidence,
assemble candidate hypotheses, test them against the evidence available, and
seek additional evidence to confirm, reinforce, or eliminate hypotheses. The
analyst iteratively applies analysis and synthesis to move forward from evidence
and backward from hypothesis to explain the available data (evidence). In the
process, the analyst identifies more data to be collected, critical missing data,
and new hypotheses to be explored. This iterative analysis-synthesis process provides the necessary traceability from evidence to conclusion that will allow the
results (and the rationale) to be explained with clarity and depth when
completed.
But the intelligence analyst is only one contributor in a larger chain of
analysis-synthesis operations, which leads to national decisions and subsequent
actions. Consider the practical sequence of analysis-synthesis processes that are
partitioned between intelligence, operations, and policy. The typical reasoning
sequence (Figure 5.2) includes three distinct functions (often performed by
three distinct organizations), each requiring an analysis-synthesis loop:
Intelligence analysis. Intelligence collects and breaks down data, guided
[Figure 5.2: The chain of analysis-synthesis functions from intelligence to operations to policy. Intelligence analyzes data in context to synthesize hypotheses (H1, H2, ... Hn); operations planning draws on available resources to synthesize courses of action (COA1, COA2, ... COAm); policy decision making applies values to the courses of action to arrive at decisions (D1, ... Dp).]
reference. From this data, hypotheses (explanations or models) are synthesized, ranked, and reported in the intelligence report.
Planning. Operations accepts the intelligence report and analyzes the
implications of the hypothesized situation before synthesizing (planning) feasible COAs or responses. These responses depend on the
resources available.
Decision making. Policy makers consider the possible COAs in the context of values (cost, risk) to determine the utility of each alternative and
make decisions based on a rational selection of highest utility.
Figure 5.2 does not include the feedback loops, which naturally occur
between these functions, that represent the collaborative interactions between
analysts, operations personnel, and decision makers. These additional interactions between participants provide the necessary context that allows upstream
analysts to focus their efforts to satisfy downstream users. The distinctions in
each of these three areas of analysis are summarized in Table 5.2, but the basis of
analytic-synthetic reasoning within each is the same. While depicted as a
sequential chain, the three functions must collaborate to provide correct
upstream insight to focus collection and analysis toward downstream decision
making.
The careful distinctions between intelligence and operations or policymaking have long been an area of sensitive discussion in the U.S. government,
from the inception of the IC up to today [3]. Sherman Kent, pioneer of American intelligence and author of Strategic Intelligence for American World Policy
[Table 5.2: Distinctions among the three analysis disciplines (intelligence analysis, operational analysis, and decision analysis), compared by analytic focus, analysis process (e.g., judging the large-scale implications of COAs on mission objectives), synthesis process (e.g., creating hypotheses, that is, models, explanations, and situations, or creating COAs), and fields of study (e.g., intelligence analysis, operations research).]
(1949), has been noted for his firm position that "Intelligence must be close enough to policy, plans and operations to have the greatest amount of guidance, and must not be so close that it loses its objectivity and integrity of judgement" [4]. The potential power of intelligence to influence policy was noted in a CIA
report discussing intelligence provided to former Secretary of State Henry
Kissinger:
Kissinger has written perceptively of the challenge a DCI faces in walking
the fine line between offering intelligence support and making policy recommendations. Probably more than any other National Security Adviser,
he was sensitive to the reality that an assessment of the probable implications of any U.S. action can come across implicitly or explicitly, intended or
not, as a policy recommendation. He wrote in White House Years, It is to
the Director [of Central Intelligence] that the assistant first turns to learn
the facts in a crisis and for analysis of events, and since decisions turn on the
perception of the consequences of actions the CIA assessment can almost
amount to a policy recommendation [5].
U.S. President George H.W. Bush summed up the general executive office
perspective of the distinction: And when it comes to the mission of CIA and
the Intelligence Community, [Director of Central Intelligence] George Tenet
has it exactly right. Give the President and the policymakers the best possible
intelligence product and stay out of the policymaking or policy implementing
except as specifically decreed in the law [6].
beliefs are asserted. The process may move from specific (or particular)
beliefs toward more general beliefs, or from general beliefs to assert
more specific beliefs.
Products. The certainty associated with an inference distinguishes two categories of results of inference. The asserted beliefs that result from inference may be infallible (e.g., an analytic conclusion derived from infallible beliefs and infallible logic is certain) or fallible judgments (e.g., a synthesized judgment is asserted with a measure of uncertainty: probably true, true with 0.95 probability, or more likely true than false).
The most basic taxonomy of inferential reasoning processes distinguishes three basic categories of reasoning: induction, abduction, and deduction. Table 5.3, similar to Table 5.1 with Kant's distinctions, provides the structure of this taxonomy and distinguishes the characteristics of each. The form of each method is represented as a common logical syllogism to allow each form to be seen in light of common everyday reasoning from premises to conclusion.
It is worth noting that while induction and deduction are the classical formal reasoning forms found in most philosophy and logic texts, abduction is the more recent pragmatic form of reasoning introduced by the mathematician and logician C. S. Peirce (1839-1914). Abduction is the less formal but more common approach of inference to achieve the best explanation with uncertain evidence.
5.2.1 Deductive Reasoning
Table 5.3
Inference: Reasoning Processes That Create and Modify Belief

Induction (inductive generalization): reasoning to apply belief about an observed sample to an entire population. Syllogistic representation: all observed As are Bs; therefore, all As are Bs.
Induction (inductive projection): reasoning to apply a belief about an observed population to a future sample. Syllogistic representation: all observed As are Bs; therefore, the next observed A will be a B.
Abduction: reasoning to create the best explanation of evidence. Syllogistic representation: D is a collection of data; H1, H2, ... Hn explain D; Hk explains D best; therefore, accept Hk.
Deduction: reasoning about premises to derive conclusions. Syllogistic representation: A or B or C or ...; but not B or C or ...; therefore, A. Product: conclusions are no stronger than the premises; deduction preserves infallible knowledge.

The table compares each process by its product (the fallibility of the asserted beliefs) and by its process (the motion of the reasoning).
Texts on formal logic present the variety of logical systems that may be
defined to provide foundations for deductive inference [8]. The classical propositional logic system (or calculus) described in Table 5.4 is the basic deductive
tool of formal logic; predicate calculus is the system of mathematics that extends
deductive principles to the quantitative realm.
5.2.2 Inductive Reasoning
Table 5.4
Several Basic Deductive Propositional Argument Forms

Modus ponens (P→Q; P; ∴ Q). Example: if an aircraft has a type 55 radar, it is a fighter; aircraft A has a type 55 radar; therefore, aircraft A is a fighter.
Modus tollens (P→Q; ¬Q; ∴ ¬P). Example: if an aircraft has a type 55 radar, it is a fighter; aircraft A is not a fighter; therefore, aircraft A does not have a type 55 radar.
Hypothetical syllogism, or chain argument (P→Q; Q→S; ∴ P→S). Example: if an aircraft has a type 55 radar, it is a fighter; if an aircraft is a fighter, it has weapons; if an aircraft has weapons, it is a threat; therefore, if an aircraft has a type 55 radar, it is a threat.
Disjunctive syllogism (P ∨ Q; ¬Q; ∴ P).

Symbols used:
P→Q means if P (antecedent) is true, then Q (consequent) is true.
P ∨ Q means either P or Q is true.
¬P means negation of the premise P (P is not true).
∴ designates "therefore," and is followed by the conclusion.
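As a concrete illustration of these argument forms, the short sketch below applies modus ponens repeatedly (the forward-chaining style noted later in this chapter) to the aircraft rules of Table 5.4. The rules and the initial fact come from the table; the representation of rules as (antecedent, consequent) pairs is only an illustrative assumption, not a prescribed implementation.

# Forward chaining over simple propositional rules (repeated modus ponens).
# Rules are (antecedent, consequent) pairs: "if antecedent then consequent".
rules = [
    ("has type 55 radar", "is a fighter"),
    ("is a fighter", "has weapons"),
    ("has weapons", "is a threat"),
]

facts = {"has type 55 radar"}           # premise observed about aircraft A

changed = True
while changed:                          # apply modus ponens until no new conclusions
    changed = False
    for antecedent, consequent in rules:
        if antecedent in facts and consequent not in facts:
            facts.add(consequent)
            changed = True

print(sorted(facts))
# The derived facts include "is a threat": the chain argument of Table 5.4.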
Induction moves from specific beliefs about instances to general beliefs about
larger and future populations of instances. It is a fallible means of inference.
The form of induction most commonly applied to extend belief from a sample of instances to a larger population is inductive generalization:
All observed As are Bs;
∴ all As are Bs.
By this method, analysts extend the observations about a limited number
of targets (e.g., observations of the money laundering tactics of several narcotics
rings within a drug cartel) to a larger target population (e.g., the entire drug
cartel).
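The fallibility of such a generalization can be made explicit by carrying the sample that supports it. The sketch below, with invented counts, illustrates inductive generalization and projection from a small sample; the use of Laplace's rule of succession to hedge the projection is our illustrative choice, not a method prescribed in this text.

# Inductive generalization and projection from a small sample (hypothetical counts).
n = 5          # narcotics rings observed within the cartel (assumed)
s = 5          # of those, rings observed using the money-laundering tactic

# Inductive generalization: "all observed As are Bs, therefore all As are Bs."
print(f"Observed: {s}/{n} rings use the tactic")

# Inductive projection, hedged with Laplace's rule of succession, which estimates
# the probability that the next observed ring also uses the tactic.
p_next = (s + 1) / (n + 2)
print(f"Probability the next observed ring uses the tactic ~ {p_next:.2f}")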
In each of these three cases, Koestler points out that the underlying inductive act is the sudden discovery of a new or novel pattern, previously hidden in the details, to the discoverer. Koestler graphically illustrated the process in a geometric analogy. To understand his analogy, consider the "ha ha" discovery process by representing the story line of a humorous story (Figure 5.3). The sequence of evidence is the series of facts (D1, D2, ...) sequentially presented in the story. These facts are projected onto an immediate or common explanation, a frame of discernment to interpret the facts, represented as the plane A. But, as the punch line (the last piece of evidence, D5) is introduced, it suddenly reveals an entirely different plane in which all of the evidence perfectly fits, revealing a hidden but parallel and ironic explanation for the story. In the geometric analogy, a sinusoid is revealed to fit the data. The cognitive-emotive reaction to the sudden realization is to exclaim "ha ha!"
Koestler uses the term bisociation to describe the process of viewing multiple explanations (or multiple associations) of the same data simultaneously. In
the example in the figure, the data can be projected onto a common plane of discernment in which the data represents a simple curved line; projected onto an
[Figure 5.3: Koestler's geometric analogy of creative discovery. Evidence items D1 through D5 projected onto the common frame of discernment (plane A) suggest a common explanation; the last piece of evidence, D5, reveals a novel frame of discernment (plane B) in which all of the evidence fits, producing the creative discovery.]
(by traffic analysis) and money transfers of a trading firm may lead to
the discovery of an organized crime operation.
A single anomalous measurement may reveal a pattern of denial and
deception to cover the true activities at a manufacturing facility in
which many points of evidence are, in fact, deceptive data fed by the
deceiver. Only a single piece of anomalous evidence (D5 in the figure)
is the clue that reveals the existence of the true operations (a new plane
in the figure). The discovery of this new plane will cause the analyst to
search for additional supporting evidence to support the deception
hypothesis.
Each frame of discernment (or plane in Koestler's metaphor) is a framework for creating a single hypothesis or a family of multiple hypotheses to explain the evidence. The creative analyst is able to entertain multiple frames of discernment, alternately analyzing possible fits and constructing new explanations, exploring the many alternative explanations. This is Koestler's constructive-destructive process of discovery.
Koestler's work and Thomas Kuhn's classic, The Structure of Scientific
Revolutions [11], both attempt to understand the point of creative inspiration
that enables inductive discovery. Kuhn argued that scientific discovery resulted
from the crisis when anomalies (e.g., experimental results that failed to fit the
accepted paradigm) challenged the belief in the current paradigm. These crises
caused searches for new explanations of the anomalous phenomena and resulted
in the discovery of new all-encompassing paradigms. Kuhn referred to philosopher Michael Polanyi's (then) pioneering works [12] in tacit knowledge to
explain how these crises led scientists to develop shared beliefs and internal personal knowledge (i.e., tacit knowledge), which led them to discovery. The
essence of both scientific discovery and intelligence analysis depends on the creative conception of hypotheses and the subsequent testing of those hypotheses by
either or both of two means [13]:
1. Confirmation of hypotheses;
5.2.3 Abductive Reasoning
Abduction also poses the issue of defining which hypothesis provides the best explanation of the evidence. The criteria for comparing hypotheses, at the most fundamental level, can be based on two principal approaches established by philosophers for evaluating truth propositions about objective reality [18]. The correspondence theory of truth holds that to say a proposition p is true is to maintain that p corresponds to the facts. For the intelligence analyst, this would equate to saying that hypothesis h corresponds to the evidence: it explains all of the pieces of evidence, with no expected evidence missing, and without having to leave out any contradictory evidence. The coherence theory of truth says that a proposition's truth consists of its fitting into a coherent system of propositions that create the hypothesis. Both concepts contribute to practical criteria for evaluating competing hypotheses (Table 5.5).
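A minimal sketch of how the correspondence criterion might be scored across competing hypotheses follows. The evidence items, the hypotheses' expectations and contradictions, and the scoring rule are all hypothetical; a fuller treatment would also test coherence (the internal consistency of each hypothesis' own propositions), which is omitted here.

# Hedged sketch: scoring competing hypotheses by correspondence with the evidence.
# Each hypothesis lists evidence items it expects and items it contradicts.
evidence = {"e1", "e2", "e3"}                      # hypothetical evidence pool

hypotheses = {
    "H1": {"expects": {"e1", "e2", "e3"}, "contradicts": set()},
    "H2": {"expects": {"e1", "e2"},       "contradicts": {"e3"}},
    "H3": {"expects": {"e1"},             "contradicts": set()},
}

def correspondence(h):
    # Fraction of the evidence the hypothesis explains, penalized by contradictions.
    explained = len(h["expects"] & evidence) / len(evidence)
    contradicted = len(h["contradicts"] & evidence) / len(evidence)
    return explained - contradicted

for name, h in sorted(hypotheses.items(), key=lambda kv: correspondence(kv[1]), reverse=True):
    print(f"{name}: correspondence score = {correspondence(h):+.2f}")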
In the next chapter, we will introduce the practical implementation of
abduction in the methodology of alternative competing hypotheses (ACH). We
now turn to integrating these formal and informal methods of reasoning for
practical analysis-synthesis in the intelligence problem-solving environment.
[Table 5.5: Hypothesis-testing criteria by basis of truth. Correspondence: the hypothesis corresponds to all of the data. Coherence: the propositions that make up the hypothesis fit together into a consistent, coherent system.]
describe how the fundamental inference methods are notionally integrated into
the intelligence analysis-synthesis process.
We can see the paths of reasoning in a simple flow process (Figure 5.4),
which proceeds from a pool of evidence and a question (a query to explain the
evidence) posed about the evidence. This process of proceeding from an evidentiary pool to detections, explanations, or discovery has been called evidence marshaling because the process seeks to marshal (assemble and organize) the evidence into a representation (a model) that:
Detects the presence of evidence that matches previously known premises
[Figure 5.4: Paths of reasoning from a pool of evidence. The deduction path tests hypotheses by fitting evidence to known templates and assessing the likelihoods of multiple hypotheses, producing detections. The abduction path assembles evidence to the best frame of discernment and conjectures the best explanation of the evidence, producing explanations judged by correspondence, coherence, and pragmatic assessment criteria. The induction path assembles evidence to a generalized frame of discernment, conjectures a generalized hypothesis, and validates the new hypothesis likelihood, producing discovery. The retroduction path conjectures a new hypothesis beyond the current frame of discernment and searches the evidence to affirm it.]
hypotheses simultaneously, the likelihood of each hypothesis (determined by the strength of evidence for each) is assessed and reported.
(This likelihood may be computed probabilistically using Bayesian
methods, where evidence uncertainty is quantified as a probability and
prior probabilities of the hypotheses are known.)
2. Retroduction. This feedback path, recognized and named by C.S.
Peirce as yet another process of reasoning, occurs when the analyst
conjectures (synthesizes) a new conceptual hypothesis (beyond the current frame of discernment) that causes a return to the evidence to seek
evidence to match (or test) this new hypothesis. The insight Peirce
provided is that in the testing of hypotheses, we are often inspired to
realize new, different hypotheses that might also be tested. In the early
implementation of reasoning systems, the forward path of deduction
was often referred to as forward chaining by attempting to automatically fit data to previously stored hypothesis templates; the path of
[Figure 5.5: The reasoning paths applied to an attack-warning example. The deduction path asks whether any evidence fits patterns of previous, known attacks (detection). The abduction path notes that the evidence approximately fits several other kinds of target attacks, assembles the evidence to the best frame of discernment, and conjectures the best explanation (explanation). The retroduction path conjectures a new hypothesis beyond the current frame of discernment (evidence that fits a new attack pattern for other kinds of targets) and searches for evidence to affirm it. The induction path conjectures a general pattern of attacks across all target categories and validates an overall attack strategy, plan, and MO process (discovery).]
on known signatures. Detect potentially important deviations, including anomaly detection of changes relative to a normal or expected state, or change detection of changes or trends over time (a minimal detection sketch follows this list).
Discovery. Detect the presence of previously unknown patterns in data, entities, or events.
Prediction. Anticipate future events based on detection of known indi-
cators; extrapolate current state forward, project the effects of linear factors forward, or simulate the effects of complex factors to synthesize
possible future scenarios to reveal anticipated and unanticipated (emergent) futures.
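The anomaly-detection case noted above can be sketched very simply: flag observations that deviate strongly from an estimate of the normal or expected state. The indicator values and the three-sigma threshold below are hypothetical and serve only to illustrate the idea.

# Hedged sketch: simple anomaly detection against a normal or expected state.
# Observations and threshold are hypothetical.
import statistics

history = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2]   # baseline observations of some indicator
new_observations = [10.1, 14.7, 9.8]

mean = statistics.mean(history)
stdev = statistics.stdev(history)

for x in new_observations:
    z = (x - mean) / stdev                      # deviation from the expected state
    flag = "ANOMALY" if abs(z) > 3.0 else "normal"
    print(f"value={x:5.1f}  z={z:+6.1f}  {flag}")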
In each of these cases, we can view the analysis-synthesis process as an
evidence-decomposing and model-building process. The objective of this
process is to sort through and organize data (analyze) and then to assemble (synthesize), or marshal, related evidence to create a hypothesis: an instantiated model that represents one feasible representation of the intelligence subject (target). The model is used to marshal evidence, evaluate logical argumentation, and provide a tool for explanation of how the available evidence best fits the analyst's conclusion. The model also serves to help the analyst understand what
evidence is missing, what strong evidence supports the model, and where negative evidence might be expected. The terminology we use here can be clarified
by the following distinctions:
A real intelligence target is abstracted and represented by models.
A model has descriptive and stated attributes or properties.
A particular instance of a model, populated with evidence-derived and
Though not glamorous, modeling provides the rigor that enables deeper (structured) and broader (collaborative) analysis. The model is an abstract representation that serves two functions:
1. Model as hypothesis. Based on partial data or conjecture alone, a model
may be instantiated as a feasible proposition to be assessed, a hypothesis. In a homicide investigation, each conjecture for "who did it" is a hypothesis, and the associated model instance is a feasible explanation for "how they did it." The model provides a framework around which
data is assembled, a mechanism for examining feasibility, and a basis
for exploring data to confirm or refute the hypothesis. The model is
often viewed as an abstract representation of an intelligence target: an
organizational structure, a financial flow network, a military unit, a
corporation, a trajectory of a submarine, or a computer-aided design
(CAD) model of an adversary's weapon or a competitor's product.
2. Model as explanation. As evidence (relevant data that fits into the
model) is assembled on the general model framework to form a
hypothesis, different views of the model provide more robust explanations of that hypothesis. Narrative (story), timeline, organization relationships, resources, and other views may be derived from a common
model. In a criminal investigation, the explanation seeks to prove the case without a doubt: a case that is both coherent (all elements of
the hypothesis are consistent with the evidence and are noncontradictory) and correspondent (all hypothesis expectations are consistent
with and not contradicted by evidence from the real world).
The process of implementing data decomposition (analysis) and model
construction-examination (synthesis) can be depicted in three process phases or
spaces of operation (Figure 5.6):
1. Data space. In this space, data (relevant and irrelevant, certain and
ambiguous) are indexed and accumulated. Indexing by time (of collection and arrival), source, content topic, and other factors is performed to allow subsequent search and access across many dimensions (a minimal indexing sketch follows this list).
[Figure 5.6: The three spaces of the analysis-synthesis process. Problem decomposition and data analysis feed the data space, which supports data searches; in the argumentation (abduction) space, evidence is organized and structured and hypotheses are tested and refined; synthesis and composition produce the explanation space views, such as timelines, relationship maps, annotated imagery, and narratives.]
2. Argumentation space. The data is reviewed; selected elements of potentially relevant data (evidence) are correlated, grouped, and assembled
into feasible categories of explanations, forming a set (structure) of
high-level hypotheses to explain the observed data. This process
applies exhaustive searches of the data space, accepting some as relevant and discarding others. In this phase, patterns in the data are discovered, although all the data in the patterns may not be present; these
patterns lead to the creation of hypotheses even though all the data may
not exist. Examination of the data may lead to creation of hypotheses
by conjecture, even though no data supports the hypothesis at this
point. The hypotheses are examined to determine what data would be
required to reinforce or reject each; hypotheses are ranked in terms of
likelihood and needed data (to reinforce or refute). The models are
tested and various excursions are examined. This space is the court in which the case is made for each hypothesis, and each is judged for completeness, sufficiency, and feasibility. This examination can lead to
requests for additional data, refinements of the current hypotheses,
and creation of new hypotheses.
3. Explanation space. Different views of the hypothesis model provide
explanations that articulate the hypothesis and relate the supporting
evidence. The intelligence report can include a single model and
explanation that best fits the data (when data is adequate to assert the
single answer) or alternative competing models, as well as the supporting evidence for each and an assessment of the implications of
each. Figure 5.6 illustrates several of the views often used: timelines of
events, organization-relationship diagrams, annotated maps and
imagery, and narrative story lines.
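A minimal sketch of the data-space indexing described in phase 1 above follows. The record fields, dimensions, and query are hypothetical; the dictionaries simply stand in for whatever indexing mechanism an implementation would actually use to support search and access across many dimensions.

# Hedged sketch: indexing accumulated data along several dimensions for later search.
from collections import defaultdict

records = [
    {"id": 1, "time": "2002-05-01", "source": "HUMINT", "topic": "finances"},
    {"id": 2, "time": "2002-05-03", "source": "SIGINT", "topic": "shipping"},
    {"id": 3, "time": "2002-05-03", "source": "OSINT",  "topic": "finances"},
]

index = {"time": defaultdict(set), "source": defaultdict(set), "topic": defaultdict(set)}
for r in records:
    for dim in index:
        index[dim][r[dim]].add(r["id"])

# Search across dimensions: all finance-related items collected on 2002-05-03.
hits = index["topic"]["finances"] & index["time"]["2002-05-03"]
print(sorted(hits))        # -> [3]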
The form of the hypothesis-model is a function of the problem being
addressed, and the model can have many views or perspectives of explanation. By
a hypothesis or explanation, we can refer to a set of views or models that represents a single hypothesis. As in a criminal investigation, the model of a crime can be viewed from many perspectives as the evidence is fitted to a comprehensive explanation (e.g., the timeline of events, the path of the suspect on a map, or the spreadsheet of stolen assets matching evidence found in the suspect's home).
Figure 5.7 illustrates several of the common forms of models, where each
may provide a different perspective on a subject of investigation: an entity, an
event, a process, or a target object. Robert Clark has enumerated and explained
practical analytic methods to quantify and synthesize descriptive and normative
models for a wide range of intelligence applications in Intelligence Analysis: Estimation and Prediction [20]. Consider the range of analytic modeling activities
that are required to answer the diverse questions posed by national and military
intelligence consumers:
What is the gross domestic product of a closed foreign regime? Economic
regarding the status of science and technology programs require a program schedule (timeline) model to be hypothesized, and milestones on
the schedule must be evaluated against observations (e.g., weapons testing or facilities construction) [21].
What is the air order of battle of a foreign nation? Order of battle ques-
[Figure 5.7: Common forms of models used to explain intelligence subjects. Relationships and organizational structure within human organizations are represented by organization charts, social networks, and organization directories. Financial models estimate a resource or budget and break down financial transactions and components, using spreadsheets, graphs, and financial-flow diagrams. Entity and event relationship models capture entity-to-entity and entity-to-event linkages (noncausal relationships) in data-linkage and network diagrams. Event linkage, temporal sequence, and operational process flows are represented by Gantt charts, phase diagrams, flow charts, and transaction sequences. Physical objects (structures, facilities, sites, and lines of communication) are represented by annotated imagery and 3D CAD models. Geophysical areas and sites locate entities and events in geospatial context with cultural and contextual features. Object tracking models represent the kinematic behavior of physical objects (e.g., aircraft, ships, and ground vehicles).]
power source, and data links) and its operations (e.g., the personnel and
heat and air conditioning). The targeting analysis evaluates which contributing functions or operations may be attacked to cease military
functionality.
What are the intentions of a foreign leader? The challenge of estimating
the intentions, beliefs, and perceptions of human leaders (decision
makers) is among the most difficult, yet most important, tasks posed to
analysts. As noted by the U.S. Director of Central Intelligence, George
Tenet:
From the mid-1960s on to the Soviet collapse, we knew roughly how
many combat aircraft or warheads the Soviets had, and where. But why did
they need that many or that kind? What did they plan to do with them? To
this day, Intelligence is always much better at counting heads than divining
what is going on inside them. That is, we are very good at gauging the size
and location of militaries and weaponry. But for obvious reasons, we can
never be as good at figuring out what leaders will do with them. Leadership
analysis remains perhaps the most difficult of analytic specialties. Mikhail
Gorbachevs rise to power in the Soviet Unionassessing his evolving
thinking and policies, their implications and the chances for their successposed huge analytical dilemmas. It is tough to divine leadership
intentions in a secretive, centrally controlled societyparticularly if that
leadership, as was true under Gorbachev, ceases to be static. Assessing
thinking beyond the leadershipidentifying other societal forces at work
and weighing their impacts, is even tougher [22].
For a single target under investigation, we may create and consider (or
entertain) several candidate hypotheses, each with a complete set of model views.
If, for example, we are trying to determine the true operations of the foreign
company introduced earlier, TradeCo, we may hold several hypotheses:
1. H1: The company is a legal clothing distributor, as advertised.
2. H2: The company is a legal clothing distributor, but company executives are diverting business funds for personal interests.
3. H3: The company is a front operation to cover organized crime, where hypothesis 3 has two subhypotheses:
H31: The company is a front for drug trafficking.
H32: The company is a front for terrorism money laundering.
In this case, H1, H2, H31, and H32 are the four root hypotheses, and the analyst identifies the need to create an organizational model, an operations flow-process model, and a financial model for each of the four hypotheses, creating 4 x 3 = 12 models. The models help the analyst define the data needed to distinguish the hypotheses; the organization structure, financial flow, and operations behaviors combined give insight into the character of the true business. In practical application, four versions of the three basic model types are maintained, and evidence is fitted to the models to determine which hypothesis best fits the data.
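The earlier observation that hypothesis likelihoods may be assessed probabilistically can be illustrated against this TradeCo example. In the sketch below, the priors and the evidence likelihoods are entirely hypothetical; it shows only how Bayes' rule would rerank the four root hypotheses as one item of evidence is fitted to the competing models.

# Hedged sketch: Bayesian reranking of the four root TradeCo hypotheses.
# Priors and likelihoods are hypothetical illustrative numbers.
priors = {"H1": 0.40, "H2": 0.30, "H31": 0.20, "H32": 0.10}

# P(evidence | hypothesis) for one item of evidence, e.g., an anomalous cash flow.
likelihood = {"H1": 0.05, "H2": 0.40, "H31": 0.60, "H32": 0.55}

unnormalized = {h: priors[h] * likelihood[h] for h in priors}
total = sum(unnormalized.values())
posterior = {h: v / total for h, v in unnormalized.items()}

for h, p in sorted(posterior.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{h}: posterior = {p:.2f}")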
operations has brought a greater emphasis on intelligence targets that exist not
only in the physical domain, but in the realms of information (e.g., networked
computers and information processes) and human decision making [23]. Information operations (IO) are those actions taken to affect an adversary's information and information systems, while defending one's own information and
information systems [24]. The U.S. Joint Vision 2020 describes the Joint Chiefs
of Staff view of the ultimate purpose of IO as to facilitate and protect U.S.
decision-making processes, and in a conflict, degrade those of an adversary
[25]. The JV2020 builds on the earlier JV2010 [26] and retains the fundamental operational concepts, with two significant refinements that emphasize IO.
The first is the expansion of the vision to encompass the full range of operations
(nontraditional, asymmetric, unconventional ops), while retaining warfighting
as the primary focus. The second refinement moves information superiority
concepts beyond technology solutions that deliver information to the concept of
superiority in decision making. This means that IO will deliver increased information at all levels and increased choices for commanders. Conversely, it will
also reduce information to adversary commanders and diminish their decision
options. Core to these concepts and challenges is the notion that IO uniquely
requires the coordination of intelligence, targeting, and security in three fundamental realms, or domains of human activities [27]. These are likewise the three
fundamental domains of intelligence targets, and each must be modeled:
1. The physical domain encompasses the material world of mass and
energy. Military facilities, vehicles, aircraft, and personnel make up the
principal target objects of this domain. The orders of battle that measure military strength, for example, are determined by enumerating
objects of the physical world.
2. The abstract symbolic domain is the realm of information. Words,
numbers, and graphics all encode and represent the physical world,
storing and transmitting it in electronic formats, such as radio and TV
signals, the Internet, and newsprint. This is the domain that is expanding at unprecedented rates, as global ideas, communications, and
descriptions of the world are being represented in this domain. The
domain includes the cyberspace that has become the principal means by
which humans shape their perception of the world. It interfaces the
physical to the cognitive domains.
3. The cognitive domain is the realm of human thought. This is the ultimate locus of all information flows. The individual and collective
thoughts of government leaders and populations at large form this
realm. Perceptions, conceptions, mental models, and decisions are
formed in this cognitive realm. This is the ultimate target of our
adversaries: the realm where uncertainties, fears, panic, and terror can
coerce and influence our behavior.
These are not three arbitrary domains; even early philosophers have recognized them as the basic components of our knowledge. Aristotle, an empiricist
philosopher, identified these three domains in his Metaphysics, written in 350
B.C. He distinguished physical objects and the abstractions (ideas) that the
mind creates once the senses perceive the object. He further distinguished the
words that the mind creates to symbolize or represent the abstractions of the
mind. He described three processes of the intellect that manipulate these
domains:
1. Apprehension is the process by which the mind perceives and understands the sensed physical object and creates a mental abstraction.
(Physical-to-cognitive object mappings are formed.)
2. Predication is the process of making declarations or propositions about the object, characterizing the object and its behavior. (Cognitive-to-symbolic mappings are created.)
3. Reasoning is the process, then, of applying logical principles to the
propositions to create new conclusions, or syllogisms. Here, Aristotle
recognized the methods of deduction and induction. (Symbolic logic
draws new conclusions about cognitive and physical objects.)
More recently, C.S. Peirce developed a mathematical theory of signs, or
semiotics, that also embraces the three fundamental domains [28]. More explicitly than Aristotle, Peirce's logic distinguished a triad of relationships between
the physical object, the symbolic sign that represents it, and the cognitive
thought in the mind:
Indeed, representation necessarily involves a genuine triad. For it involves a
sign, or representamen, of some kind, inward or outward, mediating
between an object and an interpreting thought [29].
[Figure 5.8: Representative targets, states, and observable phenomena in three domains. In the physical domain, targets such as vehicles and facilities are governed by the laws of physics, observed by EO/IR, SAR, and spectral sensors, characterized by physical components, sensor perspective, target articulations, and environmental signature variance, and detected by signature pattern matching and model matching. In the symbolic domain, targets such as packets and sessions are governed by network routing, stacks, and protocols, observed by SIGINT, NETINT, and intrusion sensors, characterized by data components, the ISO layer of the target and its form, target transformation, and network environment signature variance, and detected by data pattern matching and model-based matching. In the cognitive domain, targets such as mental states and ideas arise from human cognition, have no direct sensors, are characterized by mental components, mental states, and cultural and cognitive biases, and are detected through behavioral phenomena by behavior pattern matching and cognitive model matching.]
5.6 Summary
Intelligence analysis and synthesis is inherently an evidence decomposition and
hypothesis assembly (or model-construction) process, where the model provides
the framework around which evidence is marshaled. This framework forms the
basis for structuring alternative hypotheses and supporting arguments to provide
answers to the questions of intelligence consumers. In this chapter, we have
developed the basic concepts of reasoning and approaches to explicitly model
intelligence topics and targets, as well as the hypotheses regarding their description or behavior. A collaborative intelligence analysis-synthesis process requires
such explicit modeling (versus unshared mental models: tacit target representations locked in the minds of individual domain-expert analysts). We have
shown that the analysis-synthesis process proceeds from intelligence analysis to
operations analysis and then to policy analysis. The knowledge-based intelligence enterprise requires the capture and explicit representation of such models
to permit collaboration among these three disciplines to achieve the greatest
effectiveness and sharing of intellectual capital.
Endnotes
[1] In this text, we use the terms data (a scientific term generally used to refer to quantitative
items) and evidence (a legal term generally used to refer to qualitative facts used to substantiate a case) to refer to all forms of known facts, raw measurements, observations, or
reports that provide the basis for analysis-synthesis. Evidence refers to data that is relevant
to a problem at hand.
[2] Richard, A. O., International Trafficking in Women to the United States: A Contemporary
Manifestation of Slavery and Organized Crime, U.S. Dept of State Bureau of Intelligence
and Research, DCI Center for the Study of Intelligence Monograph, Washington, D.C.:
CIA, November 1999.
[3] See, for example, Dulles, A., The Craft of Intelligence, New York: Harper and Row, 1963;
Dulles, Alan, The Role of Intelligence in Policy Making, Harvard Law School Forum
(audio), December 13, 1963; Davis, J., The Challenge of Managing Uncertainty: Paul
Wolfowitz on Intelligence-Policy Relations, CIA Studies in Intelligence, Vol. 39, No. 5,
1996; and Ford, Harold P., CIA and the Vietnam Policymakers: Three Episodes 1962-1968,
CIA Center for Studies in Intelligence, 1997.
[4] Cited in Davis, J., The Kent-Kendall Debate of 1949, CIA Studies in Intelligence, Vol.
36, No. 5, 1992, p. 93.
[5] Helgerson, J. L., CIA Briefings of Presidential Candidates, Center for the Study of Intelligence, May 22, 1996. Quoted in Chapter 7, Concluding Observations, section entitled
Keeping out of Politics.
[6] Bush, G. H. W., Remarks at the Dedication Ceremony for the George Bush Center for Intelligence, April 26, 1999.
[7] Figure adapted from Waltz, E., Fundamentals of Reasoning and Multisensing, Figure 5,
page 41, in Hyder, Shabazian, and E. Waltz (eds.), Multisensor Data Fusion, Dordrecht, the Netherlands: Kluwer, 2002.
[8] Barwise, J., and J. Etchemendy, Language, Proof, and Logic, New York: Seven Bridges
Press, 2000.
[9] Holland, J. H., et al., Induction: Processes of Inference, Learning and Discovery, Cambridge,
MA: MIT Press, 1986.
[10] Koestler, A., The Act of Creation, New York: Macmillan, 1964, pp. 105-109. Also see his
more recent work: Koestler, A., Janus, Chapters 6 and 7, New York: Random House,
1978.
[11] Kuhn, T. S., The Structure of Scientific Revolutions, (3rd ed.), Chicago: University of Chicago Press, 1996.
[12] Polanyi, M., The Tacit Dimension, Garden City, NY: Doubleday, 1966.
[13] In practical intelligence analysis, efforts to both confirm and disconfirm are important
methods of comparing, evaluating, and selecting hypotheses. It is important to note, however, that the philosophy of science has hotly debated these methods, with some viewing
either one or the other as valid means of obtaining objective knowledge, but not both.
[14] Philosopher Karl Popper (1902-1994) applied the term falsification to the process of gaining certain knowledge by disconfirming conjectures. Popper rejected the traditional logic
of induction and confirmation as the basis for scientific discovery, asserting that certain
knowledge is gained only through falsification.
[15] Josephson, J. R., and S. G. Josephson (eds.), Abductive Inference, Cambridge, England:
Cambridge University Press, 1996.
[16] In the Silver Blaze episode, Sherlock Holmes realizes that the criminal was, in fact, the
owner of the farm because no one heard the dog bark during the commission of the crime. The nonoccurrence of the barking revealed the owner to be the only person who
could be present at night without causing the dog to bark.
[17] Commission to Assess the Ballistic Missile Threat to the United States, Side Letter to
the Rumsfeld Report, March 18, 1999. This unclassified letter was prepared subsequent to
the 1998 formal report to specifically articulate the commission's concerns about intelligence analysis processes.
[18] The coherence and correspondence theories of truth in epistemology are competing
approaches to objective truth; both hold valuable insights into basic principles for evaluating intelligence data. Here we apply the basic principles for illustration in practical analysis
but do not intend to apply the deeper philosophical implications of each theory.
[19] McCarthy, M., The Mission to Warn: Disaster Looms, Defense Intelligence Journal, Vol.
7, No. 2, 1998, p. 21.
[20] Clark, R. M., Intelligence Analysis: Estimation and Prediction, Baltimore: American Literary
Press, 1996.
[21] See, for example, the Executive Report of the Commission to Assess the Ballistic Missile Threat
to the United States, July 1998, accessed on-line on May 25, 2002, at
http://www.fas.org/irp/threat/bm-threat.htm; see also Foreign Missile Developments and the
Ballistic Missile Threat to the United States Through 2015, U.S. National Intelligence
Council, September 1999.
[22] Remarks of DCI George J. Tenet, Opening Remarks, The Conference on CIA's Analysis of the Soviet Union, 1947-1991, Princeton University, March 8, 2001, accessed online on October 30, 2001, at http://www.odci.gov/cia/public_affairs/speeches/
dci_speech_03082001.html. Also see Alberts, D. S. (et al.), Understanding Information Age
Warfare, Washington, D.C.: CCRP, September 2001, accessed on-line on October 30,
2002, at http://www.dodccrp.org/Publications/pdf/UIAW.pdf.
[23] This section is adapted from the author's paper: Waltz, E., Data Fusion in Offensive and
Defensive Information Operations, Proc. of National Symposium of Sensor and Data
Fusion, San Antonio, TX, June 2000, Vol. 1, pp. 219-232. Veridian, used by
permission.
[24] IO definition from DoD Joint Publication JP 3-13.
[25] U.S. DoD Joint Chiefs of Staff J-5, Joint Vision 2020, Washington, D.C.: Government
Printing Office, May 24, 2000, p. 28.
[26] U.S. DoD Joint Chiefs of Staff, Joint Vision 2010, Washington, D.C.: Government Printing Office, 1996.
[27] This concept of describing intelligence targets in three domains of reality was first introduced in Waltz, E., Information Warfare Principles and Operations, Norwood MA: Artech
House, 1998, see Sections 1.2 and 5.2. For a more thorough discussion of this concept,
see Waltz et al., The Critical Role of Cognitive Models in Next Generation Intelligence
Architectures, in Proc. of 8th Annual AIPA Symp., Washington D.C., March 2324,
1998. Also see Alberts, D. S. (et al.), Understanding Information Age Warfare, Washington,
D.C.: CCRP, September 2001, accessed on-line on October 30, 2002 at http://www
.dodccrp.org/Publications/pdf/UIAW.pdf.
[28] The development of concepts of semiotics applied more generally to linguistics, and
human interpretation is attributed to Peirce's contemporary, Swiss linguist Ferdinand de Saussure (1857-1913). These works are applicable to the problems of perception management by the use of signs (symbolic objects) to influence thought (cognitive objects).
[29] Peirce, C. S., C.P. 1-480, The Logic of Mathematics, 1896. In a manuscript a year later,
Peirce further developed this triad, calling the cognitive object the interpretant: A sign, or
representamen, is something which stands to somebody for something in some respect or
capacity. It addresses somebody, that is, creates in the mind of that person an equivalent
sign or perhaps a more developed sign. That sign which it creates I call the interpretant of
the first sign. Peirce, C. S., C.P. 2-228, Division of Signs, v. 1897.
[30] The term target is used throughout this chapter to refer to the object of attention, rather
than a specific target of offensive attack (though this is one function of IO). The mind of
an individual or a group is targeted as an object to be understood, modeled, and explained
by intelligence so actions can be taken. Actions to induce or coerce the mind of an individual or audience (group) also target the mind. For an early examination of this issue, see
Szafranski, R., Neocortical Warfare? The Acme of Skill, Military Review, November 1994, pp. 41-55. The article is also available in Arquilla, J., and D. Ronfeldt (eds.), In Athena's Camp: Preparing for Conflict in the Information Age, Santa Monica, CA: RAND,
1997.
[31] Since Operation Allied Force, the U.S. DoD has considered refining the broad definition
of IO to focus more narrowly on this cognitive aspect of IO, including perception management. See Verton, Dan, DoD Redefining Information Operations, Federal Computer
Week, May 29, 2000.
[32] Quotation by Captain Sir B. H. Liddell Hart in Thoughts on War (1944), cited in Joint
Doctrine for Information Operations, Joint Pub 3-13, October 9, 1998, p. II-4. It is interesting to note that Liddell Hart observed that Sun Tzu had noted the same concept.
[33] It should be noted that both domains could be considered to be metaphysical, though classical philosophers would likely object. Both the cognitive domain and the symbolic
(entirely a product of human cognition, though represented in physical phenomena) are
abstract in nature, transcend physical science, and concern the mind.
6
The Practice of Intelligence Analysis and
Synthesis
Intelligence operations ranging in scale from small private-sector, competitive
intelligence cells to large national intelligence organizations must implement
similar process flows and address similar implementation considerations to integrate analysts with intelligence processes and tools. While the last chapter introduced the theoretical aspects of analysis and synthesis, this chapter addresses the
practical implementation considerations unique to intelligence organizations.
The chapter moves from high-level functional flow models toward the processes
implemented by analysts. In Chapter 7, we will describe the detailed functional
interactions between analysts and their automated KM systems.
While the last chapter dealt with intelligence analysis-synthesis from the
perspective of rational and logical reasoning processes, here we describe the
process from the perspective of the intelligence consumer and the implementers
of enterprises of people, processes, and technologies to conduct analysis-synthesis. A practical description of the process by one author summarizes the
perspective of the intelligence user:
A typical intelligence production consists of all or part of three main elements: descriptions of the situation or event with an eye to identifying its
essential characteristics; explanation of the causes of a development as well
as its significance and implications; and the prediction of future developments. Each element contains one or both of these components: data, provided by knowledge and incoming information and assessment, or
judgment, which attempts to fill the gaps in the data [1].
[G1]: express estimative judgments as percentages or bettor's odds, where feasible, and avoid overstating the certainty of judgments (note: bettor's odds state the chance as, for example, one out of three);
[G2]: identify explicitly its assumptions and judgments;
[G3]: develop and explore alternative futures: less likely (but not impossible)
The Commission would urge that the [IC] adopt as a standard of its methodology that in addition to considering what they know, analysts consider
as well what they know they don't know about a program and set about filling gaps in their knowledge by:
[R1] taking into account not only the output measures of a program, but the
input measures of technology, expertise and personnel from both internal
sources and as a result of foreign assistance. The type and rate of foreign assistance can be a key indicator of both the pace and objective of a program into
which the IC otherwise has little insight.
[R2] comparing what takes place in one country with what is taking place in
others, particularly among the emerging ballistic missile powers. While each
may be pursuing a somewhat different development program, all of them are
pursuing programs fundamentally different from those pursued by the US,
Russia and even China. A more systematic use of comparative methodologies
might help to fill the information gaps.
[R3] employing the technique of alternative hypotheses. This technique can
help make sense of known events and serve as a way to identify and organize
indicators relative to a program's motivation, purpose, pace and direction. By
hypothesizing alternative scenarios a more adequate set of indicators and collection priorities can be established. As the indicators begin to align with the
known facts, the importance of the information gaps is reduced and the likely
outcomes projected with greater confidence. The result is the possibility for
earlier warning than if analysts wait for proof of a capability in the form of hard
evidence of a test or a deployment. Hypothesis testing can provide a guide to
what characteristics to pursue, and a cue to collection sensors as well.
[R4] explicitly tasking collection assets to gather information that would disprove a hypothesis or fill a particular gap in a list of indicators. This can prove a
wasteful use of scarce assets if not done in a rigorous fashion. But moving from
the highly ambiguous absence of evidence to the collection of specific evidence
of absence can be as important as finding the actual evidence [3].
The two reports cover the spectrum of intelligence issues, providing excellent guidelines for analysis. The GAO report addressed NIEs that produce
broad conceptual estimates (e.g., nation-state capabilities and global threat
assessments and projections), while the Rumsfeld report addressed more focused
hard-target problems where data is scarce and the subjects employ denial and
deception measures. The essence of these nine recommendations can be summarized (Table 6.1) to reveal what kind of rigor is expected by policymakers.
Notice that intelligence consumers want more than estimates or judgments; they expect concise explanations of the evidence and reasoning processes
behind judgments with substantiation that multiple perspectives, hypotheses,
and consequences have been objectively considered. They expect a depth of
analysis-synthesis that explicitly distinguishes assumptions, evidence, alternatives, and consequences, with a means of quantifying each contribution to the outcomes (judgments).

[Table 6.1 Summary of the rigor expected by policymakers, mapping the process standards for tasking ([R3], [R4]) and for analysis-synthesis ([G1], [R1], [R3], [G3], [G4], [R2], [G5]) to the GAO and Rumsfeld Commission recommendations.]

To meet these expectations, the analysis-synthesis
process must be structured, explicit, and thorough. The intelligence tradecraft
best practices described in Chapter 4 were produced to provide just such structure for analysis [4], and to provide the rigor required by national intelligence
officers [5].
In the following sections, we address the practical procedures to implement this kind of structure.
Several abstract models have been developed to describe the details of the
process, each with a different perspective and focus (Figure 6.1) [6]. The figure
is organized with increasing levels of model granularity moving down the chart.
The first two models focus on command and control decision making for military action, while the second two models are focused on the delivery of intelligence. The models are all cyclic, including the feedback from results to actions
that include sensor tasking to better observe a situation, or military response to
change a situation.
The stimulus-hypothesis-option-response (SHOR) model, described by Joseph Wohl in 1981, emphasizes the consideration of multiple perception hypotheses
to explain sensed data and assess options for response. The model detailed the considerations in commander decision making: choosing among alternative courses of action [7]. The observe-orient-decide-act (OODA) loop, developed by Col. John Boyd, is a high-level abstraction of the military command and control loop that considers the human decision-making role and its dependence on observation and orientation, the process of placing the observations in a perceptual framework for decision making [8]. While the OODA model applies
to the entire command and control process (in which intelligence provides the
observe function), the entire loop may be applied to the intelligence control loop
in which the act function governs tasking and collection. Both of these models
focus on the military situation as the object of control; the next two models view the situation as an object of surveillance, where the control loop serves to better observe and understand the situation.

[Figure 6.1 The SHOR, OODA, TPED, and JDL data fusion models of the intelligence process, mapped against the plan, task, collect, process, analyze, and disseminate steps and arranged by increasing level of model granularity.]
The tasking, processing, exploitation, dissemination (TPED) model used by
U.S. technical collectors and processors [e.g., the U.S. National Reconnaissance
Office (NRO), the National Imagery and Mapping Agency (NIMA), and the
National Security Agency (NSA)] distinguishes between the processing elements
of the national technical-means intelligence channels (SIGINT, IMINT, and
MASINT) and the all-source analytic exploitation roles of the CIA and DIA.
The TPED process has been applied to independent stovepipe intelligence
channels, and concepts have been developed to implement wide-scale multi-INT TPED processes [9]. The model is a high-level organizational model that does not include planning per se, because planning involves policy-level activities organizationally above the processing chain.
The DoD Joint Directors of Laboratories (JDL) data fusion model is a
more detailed technical model that considers the use of multiple sources to produce a common operating picture of individual objects, situations (the aggregate
of objects and their behaviors), and the consequences or impact of those situations. The model includes a hierarchy of data correlation and combination
processes at three levels (level 0: signal refinement; level 1: object refinement;
level 2: situation refinement; level 3: impact refinement) and a corresponding
feedback control process (level 4: process refinement) [10]. The JDL model is a
functional representation that accommodates automated processes and human
processes and provides detail within both the processing and analysis steps. The
model is well suited to organize the structure of automated processing stages for
technical sensors (e.g., imagery, signals, and radar).
The practical implementation of the processing and analysis stages in a
typical intelligence workflow can be described using the JDL model to distinguish the characteristics of each stage (Figure 6.2). The processing stage is characterized by high-volume single-INT processing channels (stovepipes) to
perform the JDL data fusion model level 0 and 1 functions:
Level 0, signal refinement: automated processing correlates and combines raw signals (e.g., imagery pixels or radar signals intercepted from multiple locations) to detect objects and derive their location, dynamics, or identity.

Level 1, object refinement: processing detects individual objects and correlates and combines these objects across multiple sources to further refine location, dynamics, or identity information.
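As a concrete illustration of how such a stovepipe might be organized in software, the following minimal sketch performs a level 0-like thresholding of raw samples into detections and a level 1-like correlation of detections into object reports. The class names, threshold, and gating logic are illustrative assumptions, not part of the JDL model itself.

```python
# Illustrative sketch of a single-INT "stovepipe" performing JDL level 0/1 functions.
# Names, thresholds, and gating values are hypothetical; real systems are far more elaborate.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Detection:                 # level 0 output: a detected signature
    sensor_id: str
    location: tuple              # (x, y) in some common grid
    strength: float

@dataclass
class ObjectReport:              # level 1 output: a refined object estimate
    location: tuple
    identity: str
    supporting: List[Detection] = field(default_factory=list)

def level0_signal_refinement(raw_samples, sensor_id, threshold=0.7):
    """Correlate/threshold raw signals into candidate detections."""
    return [Detection(sensor_id, loc, s) for loc, s in raw_samples if s >= threshold]

def level1_object_refinement(detections, gate=1.0):
    """Correlate detections across sensors into single-object reports."""
    reports: List[ObjectReport] = []
    for det in detections:
        for rpt in reports:
            if abs(rpt.location[0] - det.location[0]) <= gate and \
               abs(rpt.location[1] - det.location[1]) <= gate:
                rpt.supporting.append(det)       # same object observed again
                break
        else:
            reports.append(ObjectReport(det.location, "unknown", [det]))
    return reports

# Example: two sensors observing the same area
imint = level0_signal_refinement([((10, 20), 0.9), ((40, 5), 0.4)], "IMINT-1")
sigint = level0_signal_refinement([((10.5, 19.8), 0.8)], "SIGINT-3")
picture = level1_object_refinement(imint + sigint)   # one object, two supporting detections
```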
These processing stages may also include cross-INT cueing to enable the
detection of objects in one channel to cue the processing for confirming data or
data to resolve object identity in other channels.

[Figure 6.2 The contrast between the two stages: on-line, high-volume, fully automated, stovepiped single-INT processing services (e.g., IMINT, SIGINT, and HUMINT channels performing levels 0 and 1) operating on intelligence streams and large databases, feeding off-line, manually initiated, highly interactive all-source situation analysis (level 2) and impact analysis (level 3) tool services on collaborative networks, where evidence is marshaled from target object hypotheses back into the data and knowledge is expanded.]

The stage may also correlate
and combine this data across channels to perform limited level 2 situation assessments (e.g., correlation and identification of a cluster of tanks as an armored
unit on the move toward a likely target). This stage may be implemented as an
integration of high-volume processing and analysts (e.g., an IMINT chain of
image processors and imagery analysts who enter images and linked textual
analysis reports into an IMINT database for subsequent analysis). In this case,
the processing chain includes processing and single-INT analysis by specialists.
The output of this stage is a set of heterogeneous databases (e.g., imagery, video,
text, or audio) or a data warehouse for subsequent all-source analysis.
The analysis stage in the figure performs the analysis-synthesis functions
described in Chapter 5 for higher level understanding of situations and their
consequences:
Level 2, situation refinement: analysis correlates and combines the detected objects into an estimate of the situation: the aggregate of objects, their relationships, and their behaviors.
Figure 6.2 illustrates the general contrast in the processing and analysis
stages; the processing stage is on-line, processing near-real-time, high-volume
single-INT data channels while the all-source analysis stage is off-line, focused
on selecting only the required data to solve consumer problems. The processing
stage is data driven, processing data as it is collected to produce intermediate
products for large databases, while the analysis stage is goal driven, responding
to queries for intelligence answers from consumers (e.g., targeting, I&W, or
order of battle or national capability estimates). The analysis stage employs
semiautomated detection and discovery tools to access the data in large databases produced by the processing stage. In general, the processing stage can be
viewed as a factory of processors, while the analysis stage is a lower volume shop
staffed by craftsmen: the analytic team.
The level 4 process refinement flows are not shown in the figure, though
all forward processing levels can provide inputs to refine the process to: focus
collection or processing on high-value targets, refine processing parameters to filter unwanted content, adjust database indexing of intermediate data, or improve
overall efficiency of the production process. The level 4 process effectively performs the KM business intelligence functions introduced in Section 3.7.
The practical implementation of this workflow, whether in a large national
or military intelligence organization or in a small corporate competitive intelligence cell, requires a structural model of the workflow processes, policies, and
procedures that move from raw data to finished intelligence products. Later, in
Chapter 9, we illustrate the process to translate such a workflow into an enterprise functional design. The following sections focus on the critical role of the
human analyst and integration with the automation components of the
enterprise.
[Figure 6.3 plots general areas of automated reasoning technology, including templates, search, and logic; qualitative and quantitative reasoning; case-based, analogical, and connectionist approaches; and classical artificial intelligence, against the number of causal factors and the scale of their effects.]
Figure 6.3 Three categories of intelligence problems and automated reasoning technology
solutions [12].
The problem space includes nine general areas of automation technologies, with the
general intelligence problem categories overlaid (See Table 6.2 for example
intelligence problems that correspond to the three categories in Figure 6.3). The
problem categories move in increasing situation complexity from lower left
(category 1) to upper right (category 3) and can be generally related to the levels
of the JDL data fusion model. The simplest level 0 and 1 problems (lower left
corner) with few causal factors and small linear effects can be solved with simple
template matching (matched filter detection), applying search processes to
exhaustively search for matching patterns and logic to match the current situation with prior patterns. These problems include the detection of straightforward objects in images, content patterns in text, and emitted signal matching.
More difficult problems still in this category include dynamic situations with
moderately higher numbers of actors and scales of effects that require qualitative
(propositional logic) or quantitative (statistical modeling) reasoning processes.
These include those problems where small dimension deterministic or stochastic
models can accurately represent the situations for comparison with collected
data, such as the kinematic tracking of physical targets with statistical Kalman
filter models.
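For the simplest quantitative case named above, kinematic target tracking, the sketch below runs a textbook one-dimensional constant-velocity Kalman filter over a handful of position measurements; the noise covariances and measurement values are invented for illustration.

```python
# Minimal constant-velocity Kalman filter sketch (1-D position and velocity).
# Process/measurement noise and the measurement sequence are illustrative only.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition (position, velocity)
H = np.array([[1.0, 0.0]])                 # we measure position only
Q = np.array([[0.05, 0.0], [0.0, 0.05]])   # process noise covariance
R = np.array([[1.0]])                      # measurement noise covariance

x = np.array([[0.0], [0.0]])               # initial state estimate
P = np.eye(2) * 10.0                       # initial uncertainty

for z in [1.1, 2.0, 2.9, 4.2, 5.1]:        # simulated position measurements
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = np.array([[z]]) - H @ x            # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

print("estimated position %.2f, velocity %.2f" % (x[0, 0], x[1, 0]))
```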
The middle band of more complicated (category 2) problems addresses
higher dimensional and highly dynamic situations; automated processes resort
to higher order reasoning. Approaches to deal with large numbers of actors with
small and moderate scales of effects apply connectionist solutions (e.g., Bayesian
and neural networks) and reasoning by analogy. Where smaller numbers of
actors are involved but the scale of effects is greater, case-based reasoning is applied to compare the present situation with stored cases. Classical heuristic expert systems solve situations in the middle regions, but they are limited as the situations exhibit nonlinearity and emergent behaviors that exceed the representations of knowledge in limited heuristic models and create intractable search demands on the system.

Table 6.2
Representative Intelligence Problem Categories
2. Complex patterns and dynamic behavioral recognition: relationship and novelty discovery in large databases; military order of battle and operations analysis; contextual pattern recognition in multimedia; financial transactional analysis.
3. Complex situation recognition and prediction: leadership analysis; foreign political, social, and economic analysis; foreign covert missile program analysis; regional nation-state analysis; global futures alternatives analysis.
The most difficult category 3 problems, intractable to fully automated
analysis, are those complex situations characterized by high numbers of actors
with large-scale interactions that give rise to emergent behaviors. Simulation tools, described in the next chapter, can support analysts in tackling these kinds of problems.
The implementation of these automated processes to support knowledge
externalization-internalization and combination are described in Chapters 7
and 8, respectively.
[Figure: the sensemaking processes that transform data into information and knowledge, drawing upon tacit knowledge.]

[Figure: the mind's response to a new stimulus, in which cognitive processes (reasoning that places data in a logical framework to comprehend and assess the situation) and emotional processes (placing the situation in an emotional framework via somatic markers, assessing feelings, and reducing the alternative space) combine in judgment about the consequences of alternative actions, the construction of meaning, and decision making and acceptance of the outcome; mental models, simulation, and collected and compiled data support the process.]
In Psychology of Intelligence Analysis, Heuer has identified the major biases we exhibit when evaluating evidence, attributing causality in explaining relationships, and estimating relative probabilities (Table 6.3) [16]. To these biases in analytic reasoning,
researcher Thomas Gilovich has added the bane of the common subjective
biases of motivation: believing self-serving beliefs (e.g., those the intelligence
chief believes), imagined agreement (exaggerating the agreement of other analysts and colleagues), and inaccurate narratives (distortion of tacit knowledge
in narrative stories) [17]. In each case, the analyst is inclined to create mental
models that distort perception of the significance of evidence or derived inferences, and then to attribute undeserved support to those models.

Table 6.3
Cognitive Shortcomings to Be Addressed by the Analyst: biases in evaluating evidence, in attributing causality, in estimating probabilities, and in succumbing to social factors.

In Combating Mind-Set, respected analyst Jack Davis has noted that analysts must recognize
the subtle influence of mind-set, the cumulative mental model that distills analysts' beliefs about a complex subject, and find[s] strategies that simultaneously
harness its impressive energy and limit[s] the potential damage [18].
Davis recommends two complementary strategies:
1. Enhancing mind-set. Creating an explicit representation of the mind-set (externalizing the mental model) allows broader collaboration, evaluation from multiple perspectives, and discovery of subtle biases.
2. Insuring against mind-set. Maintaining multiple explicit explanations, projections, and opportunity analyses provides insurance against single-point judgments and prepares the analyst to switch to alternatives when discontinuities occur.
While these shortcomings address the problem of understanding the subject of an analysis, Davis has also cautioned analysts to beware the paradox of
expertise phenomenon that can distract attention from the purpose of an analysis.
This error occurs when discordant evidence is present and subject experts tend
to be distracted and focus on situation analysis (solving the discordance to
understand the subject situation) rather than addressing the impact on the
analysis of the consequences of the discrepancy. In such cases, the analyst must
focus on providing value added by addressing what action alternatives exist and their consequences in cost-benefit terms [19].
Heuer emphasized the importance of supporting tools and techniques to
overcome natural analytic limitations [20]: Weaknesses and biases inherent in
human thinking processes can be demonstrated through carefully designed
experiments. They can be alleviated by conscious application of tools and techniques that should be in the analytical tradecraft toolkit of all intelligence analysts. These tools and techniques support the kind of critical thinking
introduced in earlier chapters; the practical methods for marshaling evidence,
structuring argumentation, and evaluating hypotheses are introduced in the
next section.
Structuring Hypotheses
To illustrate the structure of hypotheses in practical intelligence problems, consider the intelligence conclusion in a critical 1964 U.S. CIA intelligence report
estimating the likelihood of the location and timing of China's first nuclear test.
The report, written in August 1964, concluded:
On the basis of new overhead photography, we are now convinced that the
previously suspect facility at Lop Nor in Western China is a nuclear test site
that could be ready for use in about two months. On the other hand the
weight of available evidence indicates the Chinese will not have sufficient fissionable material for a test of a nuclear device in the next few months. Thus,
the evidence does not permit a very confident estimate of the chances of a
Chinese Communist nuclear detonation in the next few months. Clearly the
possibility of such a detonation before the end of this year cannot be ruled
outthe test may occur during this period. On balance, however, we believe
that it will not occur until sometime after the end of 1964 [23].
[Figure: the hypothesis structure for the estimate. Existence: H1, Lop Nor IS a nuclear test site, versus H0, Lop Nor is NOT a nuclear test site. Status: H11, ready for a test within 2 months, versus H12, not ready for a test within 2 months.]
material to conclude that a test can occur within 2 months, but a test
within the next 4 months cannot be ruled out.
Hypothesis H12: the likelihood of a test increases beyond 2 months.
There exist a number of classical approaches to representing hypotheses, marshaling evidence to them, and arguing for their validity. Argumentation structures propositions to move from premises to conclusions. Three perspectives or
disciplines of thought have developed the most fundamental approaches to this
process (Table 6.4):
1. Rhetoric has historically contributed to the disciplined structuring of
informal oral or written arguments to provide accuracy of thought,
clarity of communication, and strength of persuasion. Aristotle
emphasized three modes of persuasive appeal (proof): logos appeals to reason, ethos appeals to the credibility of the advocate, and pathos appeals to the emotions of the audience.
2. Logic has contributed formal methods (e.g., propositional, predicate, and fuzzy logic) that impose precise structure on premises and rules of inference so that conclusions can be validly derived and explained.

[Table 6.4 Approaches to argumentation (structured inferential argumentation or informal logic; formal logic; fuzzy logic; mathematical Bayesian inference; Dempster-Shafer evidential reasoning) and their intelligence applications (all-source analysis across multiple unstructured sources; natural-language explanation of analytic results; inferential networks to implement automated data fusion; database evidence cross-correlation and linking).]
3. Mathematics has contributed probabilistic methods to describe uncertainty and quantitatively perform the inference process. These methods impose greater structure on both evidence and hypothesis and
provide a quantified method of reasoning that can be automated, presuming evidence and belief can be quantified. (Applications of these
automated methods are described in Chapter 8.)
Each discipline has contributed methods to represent knowledge and to
provide a structure for reasoning to infer from data to relevant evidence,
through intermediate hypotheses to conclusion. The term knowledge representation refers to the structure used to represent data and show its relevance as evidence, the representation of rules of inference, and the asserted conclusions. In
the following paragraphs we survey these approaches and their contributions to
the analysis-synthesis process.
6.6.3
[Figure: serial, convergent, divergent, and linked argument structures, illustrated with SAM battery, missile canister, fire control radar (frequency F), and C2 center (Golf, Hotel) evidence.]
4. Rebuttals (R) are any conditions that may refute the claim.
5. Warrants (W) are the implicit propositions (rules, principles) that permit inference from data to claim.
6. Backing (B) comprises the assurances that provide authority and currency to the warrants.
Applying Toulmin's argumentation scheme requires the analyst to distinguish each of the six elements of argument and to fit them into a standard structure of reasoning [see Figure 6.8(a)] that leads from datum (D) to claim
(C). The scheme separates the domain-independent structure from the warrants
and backing, which are dependent upon the field in which we are working (e.g.,
legal cases, logical arguments, or morals).
The general structure, described in natural language, then proceeds from datum (D) to claim (C) as follows:
The datum (D), supported by the warrant (W), which is founded upon
the backing (B), leads directly to the claim (C), qualified to the degree
(Q), with the caveat that rebuttal (R) is present.
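The six elements lend themselves to an explicit, machine-readable record. The hypothetical sketch below captures them in a small Python data structure and renders the same natural-language pattern; the field names simply follow Toulmin's terms, and the populated values loosely paraphrase the Lop Nor example.

```python
# Sketch: Toulmin's six argument elements captured as an explicit, reviewable record.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ToulminArgument:
    datum: str        # D: the evidence offered
    warrant: str      # W: the rule that licenses the inference
    backing: str      # B: the authority behind the warrant
    qualifier: str    # Q: degree of force (e.g., "presumably")
    claim: str        # C: the assertion being argued
    rebuttals: List[str] = field(default_factory=list)   # R: conditions that would refute

    def render(self) -> str:
        """Natural-language rendering following the D-W-B-C-Q-R pattern in the text."""
        txt = (f"The datum ({self.datum}), supported by the warrant ({self.warrant}), "
               f"founded upon the backing ({self.backing}), leads, {self.qualifier}, "
               f"to the claim ({self.claim})")
        if self.rebuttals:
            txt += ", unless " + "; or ".join(self.rebuttals)
        return txt + "."

arg = ToulminArgument(
    datum="overhead imagery shows a test area and special test activity at Lop Nor",
    warrant="nuclear testing requires such facilities and preparatory activity",
    backing="prior observation of nuclear test programs",
    qualifier="presumably",
    claim="a Chinese nuclear test will not occur before the end of 1964",
    rebuttals=["the Chinese hold unobserved fissionable material"],
)
print(arg.render())
```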
In Figure 6.8(b), we insert the elements of the Chinese nuclear test argument (used earlier in this section) into the Toulmin schema to illustrate how the
schema forces structure onto the analyst's argument. Such a structure requires the analyst to identify all of the key components of the argument, and to explicitly report if any components are missing (e.g., if rebuttals or contradicting evidence does not exist).
The benefit of this scheme is the potential to use automation to
aid analysts in the acquisition, examination, and evaluation of natural-language
arguments. As an organizing tool, the Toulmin scheme distinguishes data (evidence) from the warrants (the universal premises of logic) and their backing (the
basis for those premises). Notice that in the previous informal logic example,
data (the radar at location A emits at a high PRF) and warrants (so there must be
a SAM battery located there) were not distinguished; warrants and data are
equally treated as premises. It must be noted that formal logicians have criticized Toulmin's scheme for its lack of logical rigor and its inability to address probabilistic arguments. Yet it has contributed greater insight and formality to the development of structured natural-language argumentation.
Figure 6.8 (a) Toulmin's argument structure, and (b) populated argument structure example. [The domain-independent structure leads from Data (D), since Warrant (W), on account of Backing (B), to the Claim (C), unless the Rebuttal (R) applies; the warrants and backing belong to the fields of the arguments. The populated example maps the datum (overhead imagery indicates a large test area, certain facilities, and special test activities underway at Lop Nor), the warrant (nuclear testing requires a large test area, certain facilities, and special preparatory test activities), the claim (so, presumably, the Chinese will not conduct a nuclear test within this year), and the rebuttal (unless the Chinese have sufficient fissionable material available that we cannot observe).]

6.6.4 Inferential Networks
Inferential networks provide a formal means to structure complex arguments. We illustrate these networks in the following discussion using the directed acyclic graph forms introduced by Schum in his foundational work, Evidence and Inference for the Intelligence Analyst [25], and
his subsequent exhaustive text, The Evidential Foundations for Probabilistic
Reasoning [26].
The use of graph theory to describe complex arguments allows the analyst
to represent two crucial aspects of an argument:
Argument structure. The directed graph represents evidence (E), events,
or intermediate hypotheses inferred by the evidence (i), and the ultimate, or final, hypotheses (H) as graph nodes. The graph is directed
because the lines connecting nodes include a single arrow indicating the
single direction of inference. The lines move from a source element of
evidence (E) through a series of inferences (i1, i2, i3, ..., in) toward a terminal hypothesis (H). The graph is acyclic because the directions of all
arrows move from evidence, through intermediate inferences to
hypothesis, but not back again: there are no closed-loop cycles.
Force of evidence and propagation. In common terms we refer to the force,
strength, or weight of evidence to describe the relative degree of contribution of evidence to support an intermediate inference (in), or the ultimate hypothesis (H). The graph structure provides a means of
describing supporting and refuting evidence, and, if evidence is quantified (e.g., probabilities, fuzzy variables, or other belief functions), a
means of propagating the accumulated weight of evidence in an
argument.
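Such a directed acyclic graph is straightforward to hold explicitly in software. The sketch below (with arbitrary node names) records the direction of inference from evidence through intermediate inferences to the hypothesis, verifies that no closed-loop cycles exist, and enumerates the chains of reasoning from a given item of evidence.

```python
# Sketch of an inferential network as a directed acyclic graph:
# edges point from evidence (E) through intermediate inferences (i) to the hypothesis (H).
from collections import defaultdict

edges = {                      # node -> nodes it supports (direction of inference)
    "E1": ["i1"], "E2": ["i1"],
    "i1": ["i2"], "i2": ["H"],
    "E3": ["i3"], "i3": ["H"],
}

def chains_from(node, graph):
    """Enumerate all inference chains from a node to terminal hypotheses."""
    targets = graph.get(node, [])
    if not targets:
        return [[node]]
    return [[node] + rest for nxt in targets for rest in chains_from(nxt, graph)]

def is_acyclic(graph):
    """Verify the 'no closed-loop cycles' property by depth-first search."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = defaultdict(int)
    def visit(n):
        color[n] = GRAY
        for m in graph.get(n, []):
            if color[m] == GRAY or (color[m] == WHITE and not visit(m)):
                return False
        color[n] = BLACK
        return True
    return all(visit(n) for n in list(graph) if color[n] == WHITE)

assert is_acyclic(edges)
print(chains_from("E1", edges))   # [['E1', 'i1', 'i2', 'H']]
```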
Like a vector, evidence includes a direction (toward certain hypotheses)
and a magnitude (the inferential force). The basic forms of argument can be structured to describe four categories of evidence combination (illustrated
in Figure 6.9):
1. Direct. The most basic serial chain of inference moves from evidence
(E) that the event E occurred, to the inference (i1) that E did in fact
occur. This inference expresses belief in the evidence (i.e., belief in the
veracity and objectivity of human testimony). The chain may go on
serially to further inferences because of the belief in E.
2. Consonance. Multiple items of evidence may be synergistic, resulting in one item enhancing the force of another; their joint contribution provides more inferential force than their individual contributions. Two items of evidence may provide corroborative consonance; the figure illustrates the case where ancillary evidence (E2) is favorable to the credibility of the primary evidence (E1).
[Figure 6.9 Basic categories of evidence combination and their inferential effects: direct evidence E infers the intermediate hypothesis that the event occurred (i1); consonant evidence (corroborative or convergent) enhances inferential force; redundant evidence (corroborative or cumulative) diminishes the force contributed by the individual items; and dissonant evidence (contradictive or conflicting) opposes the inference.]
Figure 6.10 A simple inference network for the example military deception hypothesis. [Evidence items E1-E9 feed credibility inferences and relevance inferences (i1-i21) that lead to the final hypothesis; the network includes consonant (enhancing), redundant (diminishing), and conflicting combinations of evidence.]
Final hypothesis
The final hypothesis H = {H0, H1} weights the inferential force from the three
chains. Consonant inferences i14 and i20 from chains 2 and 3 (i14 enhances i20)
lead to i21, that the factory is a military vehicle garrison (military personnel are
conducting operations where military vehicles are stored). This inference and i8
(the company is conducting denial and deception) provide the combined inferential force for H. If the accumulated evidential force is sufficient, the analyst
makes the judgment H1 that the former factory provides cover, concealment,
and deception (CCD) for a military garrison.
Some may wonder why such rigor is employed for such a simple argument. This relatively simple example illustrates the level of inferential detail
required to formally model even the simplest of arguments. It also illustrates the
real problem faced by the analyst in dealing with the nuances of redundant and
conflicting evidence. Most significantly, the example illustrates the degree of
care required to accurately represent arguments to permit machine-automated
reasoning about all-source analytic problems.
We can see how this simple model demands the explicit representation of
often-hidden assumptions, every item of evidence, the entire sequence of inferences, and the structure of relationships that leads to our conclusion that H1 is
true.
Inferential networks provide a logical structure upon which quantified calculations may be performed to compute values of inferential force of evidence
and the combined contribution of all evidence toward the final hypothesis. In
these cases, evidence, intermediate inferences, and hypotheses (E, i, H) are
expressed as random variables using probabilities or other expressions to represent the inferential force. The most common approaches to apply quantitative
measures of uncertainty to evidence and to compute the inferential combination
of uncertain evidence are summarized in Table 6.5. In addition to Schum [26],
standard texts on multisensor data fusion and reasoning in uncertainty develop
the mathematics of these approaches [27].
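As a minimal sketch of the Bayesian row of these methods, the function below updates the odds of hypothesis H1 against H0 with a likelihood ratio for each item of evidence; the prior and ratios are invented, and conditional independence of the evidence items is assumed.

```python
# Sketch: combining independent items of evidence with Bayes' rule in odds form.
# Likelihood ratios P(E|H1)/P(E|H0) are illustrative values, not real assessments.
def posterior_probability(prior_h1, likelihood_ratios):
    odds = prior_h1 / (1.0 - prior_h1)          # prior odds of H1 vs. H0
    for lr in likelihood_ratios:
        odds *= lr                              # each evidence item scales the odds
    return odds / (1.0 + odds)                  # back to a probability

# Three items of evidence: two favor H1 (LR > 1), one weakly favors H0 (LR < 1).
p = posterior_probability(prior_h1=0.3, likelihood_ratios=[4.0, 2.5, 0.8])
print(f"P(H1 | E1..E3) = {p:.2f}")              # about 0.77 with these numbers
```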
[Table 6.5 Quantitative inferential methodologies and their inference computations: Bayesian inference, in which evidence is represented in terms of prior and conditional probabilities combined by Bayes' rule; fuzzy logic, combined by fuzzy algebra; and Dempster-Shafer evidential reasoning, combined by Dempster's rule of combination.]
Too often, the analyst focuses on a single, most-likely explanation or estimate. The simultaneous evaluation of multiple, competing hypotheses entails far greater cognitive strain than examining a single, most-likely hypothesis [28].
[Figure: steps of the competing-hypotheses process. In (3) matrix synthesis, each item of evidence (e.g., SIGINT, IMINT, MASINT, and HUMINT reports and press reporting), scored for intrinsic value and D&D potential, is marked as supporting (+) or refuting (-) each hypothesis, together with the probability that the evidence would be seen if the hypothesis were true; the process continues with (4) matrix analysis, (5) matrix refinement, (6) hypothesis analysis, and (7) decision synthesis.]
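A matrix of this kind can also be kept as an explicit artifact rather than in the analyst's head. The hypothetical sketch below records whether each item of evidence is consistent or inconsistent with each hypothesis and tallies the inconsistencies, which the competing-hypotheses approach treats as the more diagnostic count; all entries are fabricated.

```python
# Hypothetical ACH-style matrix: rows are evidence items, columns are hypotheses.
# "+" marks evidence consistent with a hypothesis, "-" inconsistent, "" not diagnostic.
evidence = ["SIGINT 022", "IMINT 034", "HUMINT case 2312", "Press report"]
hypotheses = ["H-A", "H-B", "H-C"]
matrix = {
    "SIGINT 022":       {"H-A": "+", "H-B": "+", "H-C": "-"},
    "IMINT 034":        {"H-A": "+", "H-B": "-", "H-C": "-"},
    "HUMINT case 2312": {"H-A": "",  "H-B": "+", "H-C": "-"},
    "Press report":     {"H-A": "-", "H-B": "+", "H-C": "+"},
}

# Attention goes to inconsistencies: a hypothesis with many "-" entries is a
# candidate for rejection; "+" entries prove little by themselves.
for h in hypotheses:
    cons = sum(1 for e in evidence if matrix[e][h] == "+")
    incons = sum(1 for e in evidence if matrix[e][h] == "-")
    print(f"{h}: consistent={cons}, inconsistent={incons}")
```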
Organize. A link-discovery tool automatically clusters related data sets by identifying linkages (relationships) across the different data types. These linked clusters are visualized using link-clustering tools that allow the analyst to consider the meaningfulness of data links and discover potentially relevant relationships in the real world.
Conceptualize. The linked data is translated from the abstract relationship space to diagrams in the temporal and spatial domains to assess and structure meaningful relationships of entities and events in time and space.
Table 6.6
Typical Criminal Database Categories: entities, events, organizational data, financial data, communications, and travel.
[Figure: the organize, conceptualize, and hypothesize stages applied to criminal network data. (1) Organize: people, bank accounts, credit cards, and phone records are linked (e.g., through aliases and shared accounts or calls) into a social network. (2) Conceptualize: meaningful relationships of entities and events are structured in time and space. (3) Hypothesize: evidence is marshaled into complete arguments (an inference network), and the relative evidence and inferential strength are compared across alternative hypotheses.]
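The organize step can be approximated with simple link discovery over the categories of Table 6.6: entities that share an identifier (an account or a phone number) are linked, and the connected components of the resulting graph become candidate clusters for the analyst to examine. All identifiers in the sketch are fabricated.

```python
# Sketch of the "organize" step: cluster entities that share identifiers
# (accounts, phone numbers) into candidate networks. All data is fabricated.
from collections import defaultdict

records = [                                   # (entity, shared identifier)
    ("Person A", "acct 02-4567-998"),
    ("Person B", "acct 02-4567-998"),
    ("Person B", "phone 01-555-786-0987"),
    ("Person C", "phone 01-555-786-0987"),
    ("Person D", "acct 00-4561-4326"),        # not linked to the others
]

links = defaultdict(set)                      # identifier -> entities that use it
for entity, ident in records:
    links[ident].add(entity)

# Union-find merges entities that share any identifier into clusters.
parent = {}
def find(x):
    parent.setdefault(x, x)
    if parent[x] != x:
        parent[x] = find(parent[x])
    return parent[x]
def union(a, b):
    parent[find(a)] = find(b)

for entities in links.values():
    ents = list(entities)
    for other in ents[1:]:
        union(ents[0], other)

clusters = defaultdict(set)
for entity, _ in records:
    clusters[find(entity)].add(entity)
print([sorted(c) for c in clusters.values()])   # [['Person A', 'Person B', 'Person C'], ['Person D']]
```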
[Table: information operations objectives, methods, and target audiences. White propaganda (PSYOP) uses declared sources and organizations to influence target audiences to accept general beliefs; gray propaganda uses undeclared sources and organizations; black propaganda uses false sources and organizations; the principal target audiences are leadership and the population at large. Deception uses diplomatic channels, sympathetic influences, and open news sources and channels to induce beliefs in national or military leadership. Intelligence denial and deception (OPSEC) denies access to information about intent and capabilities, deceives and defeats human and technical collectors, and minimizes the signatures of entities and activities; its target audience is intelligence collectors and analysts.]
Whereas white propaganda openly declares the source of the information, gray propaganda uses undeclared sources. Black propaganda purports to originate from a source other than its actual sponsor, protecting the
true source (e.g., clandestine radio and Internet broadcast, independent organizations, or agents of influence [33]). Coordinated white, gray, and black propaganda efforts were strategically conducted by the Soviet Union throughout the
Cold War as active measures of disinformation:
for the Soviet Union, active measures constitute a dynamic and integrated array of overt and covert techniques for influencing events and
behavior in, and the actions of, foreign countries. These measures are
employed to influence the policies of other governments, undermine confidence in the leaders and institutions of these states, disrupt the relations
between various nations, and discredit and weaken major opponents. This
frequently involves attempts to deceive the target, and to distort the target's
perception of reality [34].
Denial operations by means of OPSEC seek to deny access to true intentions and capabilities by minimizing the signatures of entities and activities.
The cognitive shortcomings noted in the prior section can contribute to
self-deception on the part of the analyst. Ernest May summarized the three basic vulnerabilities of intelligence analysts:
[Analysts] are vulnerable in the first place because they follow an almost
unavoidable rule of trying to fit the evidence they have into some coherent,
rational whole. They are vulnerable in the second place because, partly
perhaps from awareness of the power of prejudice and preconception, they
have a preference for data that is quantifiable and therefore appear comparatively objective. And thirdly they are vulnerable to deception
because, after having to judge hard issues, they are prone to look for confirming rather than disconfirming evidence [38].
Table 6.8
Countermeasures to D&D: the table maps the deceiver's objectives to counterdeception measures in collection and in analysis. Dissimulation seeks to deny the observation and detection of true target phenomena; reinforcement seeks to reinforce analysts' mental sets and expectations, conditioning the analyst to reduce sensitivity to changes in target phenomena.
[Figure: a counter-D&D analysis process. Intelligence collection feeds the analysis and synthesis of alternative competing hypotheses (some of which explicitly include D&D), supported by reconstructive inference (searching for sprignals), incongruity testing, past patterns of D&D and capabilities, and an estimate of adversarial D&D plans; the process drives focused D&D hypothesis tasking, counter-D&D tasking, and intelligence vulnerability assessment.]
Such a process has been described by Harris in a study on countering D&D and includes two active
searches for evidence to support, refute, or refine the D&D hypotheses [44]:
1. Reconstructive inference. This deductive process seeks to detect the presence of spurious signals (Harris calls these sprignals) that are indicators of D&D: the faint evidence predicted by conjectured D&D plans.
Such sprignals can be strong evidence confirming hypothesis A (the
simulation), weak contradictory evidence of hypothesis C (leakage
from the adversary's dissimulation effort), or missing evidence that
should be present if hypothesis A were true.
2. Incongruity testing. This process searches for inconsistencies in the
data and inductively generates alternative explanations that attribute
the incongruities to D&D (i.e., D&D explains the incongruity of evidence for more than one reality in simultaneous existence).
These processes should be a part of any rigorous alternative hypothesis
process, developing evidence for potential D&D hypotheses while refining the
estimate of the adversary's D&D intents, plans, and capabilities. The processes
also focus attention on special collection tasking to support, refute, or refine current D&D hypotheses being entertained.
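As a toy illustration of incongruity testing, the check below flags pairs of evidence items that support mutually exclusive hypotheses; exactly this kind of inconsistency should prompt consideration of a D&D explanation. The hypotheses, evidence items, and exclusivity list are invented for the example.

```python
# Toy incongruity test: flag evidence pairs that support mutually exclusive hypotheses.
# Such incongruities prompt the analyst to consider a D&D hypothesis that explains both.
from itertools import combinations

mutually_exclusive = {("unit present", "unit absent")}    # hypotheses that cannot both be true

support = {                                                # evidence -> hypothesis it supports
    "IMINT: vehicles observed at site": "unit present",
    "SIGINT: no emissions from site": "unit absent",
    "HUMINT: garrison reported active": "unit present",
}

def incongruities(support, exclusive):
    hits = []
    for (e1, h1), (e2, h2) in combinations(support.items(), 2):
        if (h1, h2) in exclusive or (h2, h1) in exclusive:
            hits.append((e1, e2))
    return hits

for e1, e2 in incongruities(support, mutually_exclusive):
    print(f"Incongruity: '{e1}' vs. '{e2}' -- consider a D&D explanation")
```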
6.9 Summary
Central to the intelligence cycle, analysis-synthesis requires the integration of
human skills and automation to provide description, explanation, and prediction with explicit and quantified judgments that include alternatives, missing
evidence, and dissenting views carefully explained. The challenges of discovering the hidden, forecasting the future, and warning of the unexpected cannot be met with infallibility, yet expectations remain high for the analytic community. The U.S. director of central intelligence (DCI) has described these
expectations:
What, then, if not infallibility, should our national leaders, and ultimately
the American public, expect of our analysts?
First and foremost, they should expect our analysts to deliver intelligence
that is objective, pulls no punches, and is free from political taint.
Next, they should expect that our analysts think creatively, constantly
challenging the conventional wisdom and tapping expertise wherever it
liesinside the IC or in the private sector and academia.
They should expect that our analysts always act with the highest standards of professionalism.
They should expect that they take risksanalytic risksand make the
tough calls when it would be easier to waffle.
They should expect that they respond to the President's and other decision makers' needs on demand, juggling analytic priorities and capabilities to meet the most urgent missions.
And, finally, they should expect that our analysis not only tells policymakers about what is uppermost on their minds, but also alerts them to things that have not yet reached their in-boxes [45].
Endnotes
[1]
Kam, E., Surprise Attack, Boston: Harvard University Press, 1988, p. 120.
[2] Foreign Missile Threats: Analytic Soundness of Certain National Intelligence Estimates,
U.S. General Accounting Office, B-274120, August 30, 1996, accessed on-line
in December 2001 at http://www.house.gov/hasc/openingstatementsandpressreleases/
104thcongress/gaonie.pdf.
[3] Commission to Assess the Ballistic Missile Threat to the United States, Side Letter to
the Rumsfeld Commission Report, March 18, 1999. This unclassified letter was prepared
subsequent to the 1998 formal report to specifically articulate the commission's concerns
about intelligence analysis processes.
[4] A Compendium of Analytic Tradecraft Notes, Volume I (Notes 1-10), Washington, D.C.:
CIA, 1995. Note 3 addresses the means of articulating assumptions, note 4 addresses the
methods to articulate alternative outcomes (hypotheses), and note 5 addresses the methods
to depict facts and sourcing in intelligence judgments.
[5] McCarthy, M., The Mission to Warn: Disaster Looms, Defense Intelligence Journal,
Vol.7, No.2, 1998, p. 21.
[6] Note that the figure includes six steps at the top while the intelligence cycle introduced in
Chapter 2 has five steps. The planning and direction step has been divided into planning
and tasking in the table to facilitate discussion of the models.
[7] Wohl, J. G., Force Management Decision Requirements for Air Force Tactical Command and Control, IEEE Trans. Systems, Man, and Cybernetics, Vol. SMC-11, No. 9,
September 1981, pp. 618-639. For a description of the application of the SHOR model
in automated intelligence data fusion, see, Waltz, E. L., and D. M. Buede, Data Fusion
and Decision Support for Command and Control, IEEE Transactions on Systems, Man,
and Cybernetics, Vol. SMC-16, No. 6 (November-December 1986), pp. 865-879.
[8] Boyd, J. R., The Essence of Winning and Losing, unpublished briefing, January 1996.
For an overview of command models, see Appendix: Alternative Models of Command
and Control, Command Concepts: A Theory Derived from the Practice of Command and
Control, MR-775-OSD, RAND, 1999. For commentaries on the contributions of John
Boyd, see: Hammond, G. T., The Mind of War: John Boyd and American Security, Washington D.C.: Smithsonian Institute Press, 2001; and Coram, R., Boyd: The Fighter Pilot
Who Changed the Art of War, Boston: Little, Brown, 2002.
[9] This model has also been called TCPED to include collection. For a discussion of the
Multi-INT TPED process, see Section 14 in The Information Edge: Imagery Intelligence
and Geospatial Information In an Evolving National Security Environment, Report of the
Independent Commission on the National Imagery and Mapping Agency, Washington,
D.C., January 9, 2001.
[10] The JDL model is described in further detail in Chapter 8.
[11] The concept and figure is adapted from Minsky, M., Common-Sense Based Interfaces,
Communications of the ACM, Vol. 43, No. 8, p. 71. Minsky first published a version of
this chart in July 1992 in Toshiba Review, Vol. 47, No. 7, accessed on-line on August 14,
1998, at http://minsky,www.media.mit.edu/people/minsky/papers/CausalDiversity/html.
[12] Figure adapted from Minsky, M. Common-Sense Based Interfaces, Communications of
the ACM, Vol. 43, No. 8. Used by permission from ACM.
[13] Weick, K., Sensemaking in Organizations, Thousand Oaks, CA: Sage, 1995.
[14] Damasio, A. R., Descartes' Error: Emotion, Reason and the Human Brain, New York: Putnam, 1994.
[15] Davis, J., Combating Mind-Set, Studies in Intelligence, Vol. 36, No. 5, 1992, p. 33.
[16] Heuer Jr., R. J., Psychology of Intelligence Analysis, Washington D.C.: CIA Center for the
Study of Intelligence, 1999. This table summarizes the biases described in Chapters 10,
11, and 12.
[17] Gilovich, T., How We Know What Isn't So, New York: Free Press, 1991.
[18] Davis, J., Combating Mind-Set, Studies in Intelligence, Vol. 36, No. 5, 1992, pp. 33-38.
[19] See Symposium on the Psychology of Intelligence, in Bulletin of the Center for the Study
of Intelligence, Issue 11, Summer 2000, p. 1.
[20] Heuer Jr., R. J., Psychology of Intelligence Analysis, Chapter 1, Washington D.C.: CIA Center for the Study of Intelligence, 1999.
[21] Jones, M., Thinker's Toolkit, New York: Three Rivers Press, 1995, pp. 12-46.
[22] The U.S. Joint Military Intelligence College, for example, emphasizes the importance of
applying structured methodologies. See, Brei, W., Getting Intelligence Right: The Power of
Logical Procedure, Occasional Paper 2, Joint Military Intelligence College, Washington
D.C., January 1996, and Folker, R. D., Intelligence Analysis in Theater Joint Intelligence
Centers: An Experiment in Applying Structured Methods, Occasional Paper 7, Joint Military
Intelligence College, Washington D.C., January 2000.
[23] The Chances of an Imminent Communist Chinese Nuclear Explosion, Special National
Intelligence Estimate, SNIE-13-4-64, August 26, 1964, in Ruffner, K. C. (ed.), Corona:
America's First Satellite Program, Washington D.C.: CIA Center for the Study of Intelligence, 1995, p. 239.
[24] Toulmin, S. E., The Uses of Argument, Cambridge, England: Cambridge University Press,
1958.
[25] Schum, D. A., Evidence and Inference for the Intelligence Analyst, Vols. I and II, Lanham,
MD: University Press of America, 1987; this text was authored while Schum was a scholar
in residence at the CIA.
[26] Schum, D. A., The Evidential Foundations for Probabilistic Reasoning, Evanston, IL: Northwestern University Press, 2001. The brief introduction to inferential networks in this section is based on Schum's exhaustive treatment, but does not approach the many critical
nuances of the theory developed by Schum. The reader is encouraged to turn to Schum's
works for the details necessary to implement inferential nets.
[27] See, for example: Waltz, E., and J. Llinas, Multisensor Data Fusion, Norwood, MA: Artech House,
1990; Hall, D. L., Mathematical Techniques in Multisensor Data Fusion, Boston: Artech House,
1992; Antony, R., Principles of Data Fusion Automation, Boston: Artech House,
1995; Pearl, J., Causality: Models, Reasoning, and Inference, Cambridge, England:
Cambridge University Press, 2000; Hall, D. L., and J. Llinas (eds.), Handbook of Multisensor Data Fusion, Boca Raton, FL: CRC Press, 2001.
[28] Heuer, R. J., Jr., Psychology of Intelligence Analysis, Chapter 4, Strategies for Analytic
Judgment, Washington D.C.: CIA Center for the Study of Intelligence, 1999.
[29] See Anderson, T., and W. Twining, Analysis of Evidence: How to Do Things with Facts Based
on Wigmore's Science of Judicial Proof, Evanston, IL: Northwestern University Press, 1998.
Wigmore's original presentation was in: Wigmore, J. H., The Science of Judicial Proof, Boston: Little, Brown, 1937.
[30] This process is adapted from the eight-step process in Heuer, R. J., Jr., Psychology of Intelligence Analysis, Chapter 8: Analysis of Competing Hypotheses. See also Sawka, K.,
Competing Hypothesis Analysis, Competitive Intelligence, Vol. 2, No. 3, July–Sept.
1999, pp. 37–38.
[31] Daniel, D. C., and K. L. Herbig, Propositions on Military Deception, in Strategic Military Deception, Daniel, Donald C., and Herbig, K. L. (eds.), New York: Pergamon, 1982,
p. 5.
[32] For a discussion of new D&D challenges, see Wirtz, J. J., and R. Godson, Strategic Denial
and Deception: The 21st Century Challenge, New Brunswick, NJ: Transaction Publishers,
2002.
[33] Agent-of-influence operations carry out the subornation (knowing or unwitting) of a person who will use their position, influence, power, or credibility to promote the objectives
of a foreign power.
[34] Shultz, R. H., and R. Godson, Dezinformatsia: Active Measures in Soviet Strategy, Washington D.C.: Pergamon-Brassey's, 1984, p. 16; see also Bittman, Ladislav, The KGB and
Soviet Disinformation, Washington D.C.: Pergamon-Brassey's, 1985.
[35] See Daniel, D. C., and K. L. Herbig (eds.), Strategic Military Deception, New York: Pergamon
Press, 1982; Wohlstetter, R., Pearl Harbor: Warning and Decision, Palo Alto, CA: Stanford
University Press, 1962; and Hesketh, R. (with foreword by Nigel West), Fortitude: The
D-Day Deception Campaign, London: St Ermin's Press, 1999.
[36] Jervis, R., Perception and Misperception in International Politics, Princeton, NJ: Princeton
University Press, 1976.
[37] Quotation by Adm. Jeremiah in news conference on recommendations from the Study of
U.S. Intelligence Community Performance in Detecting Indian Nuclear Testing, June 2,
1998, released by the CIA Public Affairs Staff on June 4, 1998.
[38] May, E. R., Capabilities and Proclivities, in Knowing One's Enemies: Intelligence Assessment Before the Two World Wars, May, Ernest R. (ed.), Princeton, NJ: Princeton University Press, 1984, pp. 537–538.
[39] Heuer, R., Cognitive Factors on Deception and Counterdeception, in Strategic Military
Deception, Daniel, Donald C., and Katherine L. Herbig (eds.), New York: Pergamon,
1982, p. 61. A previous version of the chapter was published as Strategic Deception and
Counterdeception, International Studies Quarterly, Vol. 25, No. 2, June 1981,
pp. 294–327.
[40] Waltz, E., Employing Data Fusion Tools within Intelligence Community Analysis, in
Proc. of National Symp. on Sensor and Data Fusion, August 2002.
[41] Whaley, B., Stratagem: Deception and Surprise in War, unpublished monograph, MIT, 1969.
[42] Whaley, B., Stratagem: Deception and Surprise in War, unpublished monograph, MIT, 1969, p. 147.
[43] Handel, M. I., Intelligence and Deception, in Gooch, J., and A. Perlmutter, Military
Deception and Strategic Surprise, London: Frank Cass, 1982, p. 144.
[44] Harris, W. R., On Countering Strategic Deception, R-1230-ARPA, November 1973,
pp. 33–50.
[45] Remarks of DCI George J. Tenet, Opening Remarks, The Conference on CIA's Analysis of the Soviet Union, 1947–1991, Princeton University, March 8, 2001, accessed
on-line on August 21, 2001, at http://www.odci.gov/cia/public_affairs/speeches/
dci_speech_03082001.html.
7
Knowledge Internalization and
Externalization
The process of conducting knowledge transactions between humans and computing machines occurs at the intersection of tacit and explicit knowledge, between human reasoning and sensemaking on one side and the explicit computation of automation on the other. The processes of externalization (tacit-to-explicit transactions) and
internalization (explicit-to-tacit transactions) of knowledge, however, are not
just interfaces between humans and machines; more properly, the intersection is
between human thought, symbolic representations of thought, and the observed
world. When an analyst writes, sketches, types on a keyboard, or explains some
experience or conceptual mental model in the symbols of language, externalization is taking place. When collected sensor data, message traffic, a foreign television interview, or a computer visualization of multidimensional data is viewed
and mentally absorbed by an analyst, internalization is taking place. Both
processes conduct a transaction of information between the human mind (and
emotion) and another explicit symbolic abstraction that represents the real
world. In this chapter, we examine externalization and internalization in the
intelligence workflow and the tools that support the processes.
the spiral deal with analyst interaction with explicit information; each activity has
interactions between the analyst and supporting cognitive services (or tools)
unique to each phase. The workflow in these three phases of the spiral includes
distinguishable tacit activities in the analysts mind and complementary tools or
services to support those activities (Figure 7.1).
[Figure 7.1: The three phases of the spiral and their supporting services. Externalize — explicitly describe the problem and collect explicit data (query, set up, retrieve all-source data). Combine — correlate and combine data, set automated reasoning parameters, view correlated data, and induce new patterns while deducing known patterns (organize, correlate, combine, detect, and discover explicit data), supported by interactive analytic tools and automated data fusion and mining. Internalize — create possible meanings, synthesize hypotheses, immerse in explicit models and simulations, create and decide between explanations of the data, explore implications, and commit to analytic judgments, supported by hypothesis modeling-simulation and decision models. The flow proceeds through set up, view, focus, create, explore, and decide steps, links back to socialization, and ends with publication of the analytic judgments.]
This explicit-explicit transfer process correlates and combines the collected data
in two ways:
1. Interactive analytic tools. The analyst uses a wide variety of analytic
tools (discussed further in this chapter) to compare and combine data
elements to identify relationships and marshal evidence against
hypotheses.
2. Automated data fusion and mining services. Automated data combination services (discussed further in Chapter 8) also process high-volume data to bring detections of known patterns and discoveries of
interesting patterns to the attention of the analyst.
While the analyst is using these interactive tools and automated services to
combine explicit data, the analyst also observes the results and continues to create and modify emerging tacit mental models that explain and represent meaning contained in the data.
Internalization
[Figure: Enterprise architecture tiers. The data tier captures and filters all sources, applies operational processing services to operational data stores, and uses extract, transform, and load services to populate a data warehouse and data marts of specialized databases, with index, query, and retrieval services behind a portal and tailored point, narrowcast, and broadcast distribution. The user tier provides cognitive (analytic) services, collaboration services, and digital production and distribution.]
3. Cognitive (analytic) services. The analysis-synthesis and decision-making processes described in Chapters 5 and 6 are supported by cognitive services (thinking-support tools).
4. Collaboration services. These services, described in Chapter 4, allow
synchronous and asynchronous collaboration between analytic team
members.
5. Digital production services. Analyst-generated and automatically created
dynamic products are produced and distributed to consumers based
on their specified preferences.
6. Workflow management. The workflow is managed across all tiers to
monitor the flow from data to product, to monitor resource utilization, to assess satisfaction of current priority intelligence requirements, and to manage collaborating workgroups.
Subsequent sections in this chapter describe the search and retrieval tools
that are key to the externalization process (Section 7.2), as well as the analytic
services that support interactive combination, hypothesis modeling, and decision making used in the internalization process (Section 7.3). Digital production processes are described in Section 7.4. The means of tacit-explicit
interaction between the human analyst and these services is described in
Section 7.5.
7.2.1 Data Storage
pendent data stores for imagery, text, video, geospatial, and special
technical data types. These data types are served by an equally high
number of specialized applications (e.g., image and geospatial analysis
and signal analysis).
Legacy. Storage system designers are confronted with the integration of
existing (legacy) and new storage systems; this requires the integration
of diverse logical and physical data types.
Federated retrieval and analysis. The analyst needs retrieval, application,
and analysis capabilities that span across the entire storage system.
These storage and application integration challenges are faced by most
business enterprises and require storage structure and application integration
trade-offs that influence performance, scalability, and system cost.
Storage structure alternatives. Heterogeneous, or data-oriented, data-
[Table: Storage structure alternatives. Data federation — maintains local control while providing global viewing; integrates many independent systems. Data marts and warehouses — extract, transform, and load data into a warehouse for historical archival analysis. Enterprise application integration (middleware applications) — synchronized updates and a unified view from multiple system reads, but not very scalable.]
long-term archived analysis across all sources. Integration at the application level, or enterprise application integration (EAI), adds a layer of
custom middleware that translates, synchronizes, and updates data
from all heterogeneous stores across a layer of common applications.
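To make the EAI trade-off concrete, the following minimal sketch (the store contents, field names, and adapters are invented for illustration and are not drawn from any particular product) shows the middleware idea of translating heterogeneous stores into a common schema behind a unified read:

# Minimal EAI-style sketch: each adapter translates a native record format into a
# common schema, and the mediator presents a unified view across heterogeneous stores.
imagery_store = [{"img_id": "IM-001", "tgt": "Bridge 12", "t": "2002-10-15T07:44Z"}]
sigint_store = [{"rpt": "SG-033", "target_name": "Bridge 12", "time": "2002-10-15T07:41Z"}]

def imagery_adapter(rec):
    return {"source": "IMINT", "target": rec["tgt"], "time": rec["t"], "id": rec["img_id"]}

def sigint_adapter(rec):
    return {"source": "SIGINT", "target": rec["target_name"], "time": rec["time"], "id": rec["rpt"]}

def unified_view(target):
    """Read every store through its adapter and return records about one target."""
    translated = [imagery_adapter(r) for r in imagery_store] + \
                 [sigint_adapter(r) for r in sigint_store]
    return [r for r in translated if r["target"] == target]

print(unified_view("Bridge 12"))

In this approach the custom translation layer, rather than the underlying stores, carries the burden of synchronization, which is one reason the table above characterizes EAI as not very scalable.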
7.2.2 Information Retrieval
2. Text query and retrieval is performed on both structured and unstructured text in multiple languages by a variety of natural language
search engines to locate text containing specific words, phrases, or
general concepts within a specified context.
Data query methods are employed within the technical data processing
pipelines (IMINT, SIGINT, and MASINT). The results of these analyses are
then described by analysts in structured or unstructured text in an analytic database for subsequent retrieval by text query methods. In this section, we briefly
introduce the IR query and retrieval (or question and answer) approaches that
deal with structured and unstructured text. Moldovan and Harabagiu have
defined a five-level taxonomy of Q&A systems (Table 7.1) that range from the
common keyword search engine that searches for relevant content (class 1) to
reasoning systems that solve complex natural language problems (class 5) [3].
Each level requires increasing scope of knowledge, depth of linguistic understanding, and sophistication of reasoning to translate relevant knowledge to an
answer or solution.
The first two levels of current search capabilities locate and return relevant
content based on keywords (content) or the relationships between clusters of
words in the text (concept). The performance of such retrieval is measured in
terms of precision (the ratio of relevant content retrieved to the total content
retrieved) and recall (the ratio of relevant content retrieved to the total relevant
content available). These IR search engines offer the use of Boolean expressions,
proximity searches to reduce the specificity around a word or concept, and
weighted searching to specify different weights on individual search terms. The
retrieved corpus may be analyzed to present the user with a taxonomy of the
retrieved content to aid the user to select the most relevant content. While class
1 capabilities only match and return content that matches the query, class 2
capabilities integrate the relevant data into a simple response to the question.
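To make the two measures concrete, a minimal sketch (the document identifiers are invented) computes precision and recall for a single query:

# Precision = relevant retrieved / total retrieved; recall = relevant retrieved / total relevant.
retrieved = {"d1", "d2", "d3", "d4", "d5"}   # documents returned by the search engine
relevant = {"d2", "d4", "d7", "d9"}          # documents actually relevant to the query

relevant_retrieved = retrieved & relevant
precision = len(relevant_retrieved) / len(retrieved)  # 2/5 = 0.40
recall = len(relevant_retrieved) / len(relevant)      # 2/4 = 0.50
print(f"precision = {precision:.2f}, recall = {recall:.2f}")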
Class 3 capabilities require the retrieval of relevant knowledge and reasoning about that knowledge to deduce answers to queries, even when the specific
answer is not explicitly stated in the knowledge base. This capability requires the
ability to both reason from general knowledge to specific answers and provide
rationale for those answers to the user.
Class 4 and 5 capabilities represent advanced capabilities, which require
robust knowledge bases that contain sophisticated knowledge representation
(assertions and axioms) and reasoning (mathematical calculation, logical inference, and temporal reasoning). The DARPA High-Performance Knowledge Base
and Rapid Knowledge Formation research programs and the Cycorp CYC
knowledge base and inference engine target these classes of Q&A performance.
These problem-solving classes perform knowledge retrieval, rather than IR.
[Table 7.1: A Taxonomy of Query and Retrieval Capabilities. The classes and their levels of processing range from (1) keyword content search and (2) ontology-based conceptual search, through (3) advanced natural language reasoning and (4) domain-specific high-performance Q&A, to (5) integrated deductive-inductive reasoning with the ability to model or simulate situations and return a solution.]
The development of analytic and data integration tools will be one of the
most important and expensive areas for the analytic production community. Without such tools, the shrinking analytic workforce will have no
hope of managing the flood of new intelligence information or shifting
smoothly from one crisis or issue area to another. To achieve progress, we
must develop: 1) An automated analytic workflow process relying on
advanced analytic tools, such as visualization, search, processing, KM and
dynamic updating, 2) New tools that reveal connections, facilitate analytic
insights and deductions and streamline search by prioritizing information,
automatically populating databases, and integrating data [4].
relationships, event sequences, and temporal transactions (e.g., electronic, financial, or communication).
Link analysis. This involves automated exploration of relationships
[Table: Tool categories across the analytic workflow. 1. Exploration (objects: data and text in massive volume, greater than 10^10) — search, navigate, organize, query, and explore (browse) data; tools include information retrieval, ontology creation, extraction of content, concepts, and relationships, conversion (content translation), data/text clustering, summarization, abstraction, and categorization, filtering and monitoring of database or Web site changes, and visualization of and interaction with high-dimensionality data. 2. Reasoning — tools include data/text mining (pattern discovery), data/text fusion (pattern detection and content tracking), change detection, problem-solving knowledge retrieval, link analysis, and temporal-spatial mapping and analysis. 3. Sensemaking — tools include modeling and simulation for immersion and exploration, structured argumentation, alternative hypothesis comparison, creativity support, and visualization of and interaction with arguments. 4. Decision, judgment (objects: decisions, on the order of four decision alternatives) — tools include modeling and simulation for COA and consequence comparison, risk analysis, utility analysis, and alternative decision comparison.]
requirements, monitors tasking to meet the requirements, links evidence and hypotheses to those requirements, tracks progress toward
meeting requirements, and audits results;
Relevant data linking: maintains ontology of subjects relevant to the
intelligence requirements and their relationships and maintains a database of all relevant data (evidence);
Collaboration directory: automatically locates and updates a directory of
[Figure: Analytic workflow for the scenario — (1) a collaboration workspace and (2) workflow management support socialization; (3) information retrieval against heterogeneous data stores and sources using search criteria derived from the EEIs, (4) taxonomy creation with summarization and translation, and (5) viewing and drill-down into the knowledge base support externalization; (6) text mining and (7) link analysis support combination; (8) a financial model, (9) an ACH matrix, and (10) decision support tools support internalization; (11) digital production delivers the results.]
search current holdings and external sources, retrieving relevant multimedia content. The analyst also sets up monitor parameters to continually check certain sources (e.g., field office cables and foreign news
sites) for changes or detections of relevant topics; when detected, the
analyst will be alerted to the availability of new information.
4. The IR tools also create a taxonomy of the collected data sets, structuring the catch into five major categories: Zehga organization (personnel), events, finances, locations, and activities. The taxonomy breaks
each category into subcategories of clusters of related content. Documents located in open-source foreign news reports are translated into
English, and all documents are summarized into 55-word abstracts.
5. The analyst views the taxonomy and drills down to summaries, then
views the full content of the most critical items to the investigation.
Selected items (or hyperlinks) are saved to the shared knowledge base
for a local repository relevant to the investigation.
6. The retrieved catch is analyzed with text mining tools that discover
and list the multidimensional associations (linkages or relationships)
between entities (people, phone numbers, bank account numbers, and
addresses) and events (meetings, deliveries, and crimes).
7. The linked lists are displayed on a link-analysis tool to allow the analyst to manipulate and view the complex web of relationships between
people, communications, finances, and the time sequence of activities.
From these network visuals, the analyst begins discovering the Zehga
organizational structure, relationships to other drug cartels and financial institutions, and the timeline of explosive growth of the cartel's
influence.
8. The analyst internalizes these discoveries by synthesizing a Zehga
organization structure and associated financial model, filling in the
gaps with conjectures that result in three competing hypotheses: a centralized model, a federated model, and a loose network model. These
models are created using a standard financial spreadsheet and a network relationship visualization tool. The process of creating these
hypotheses causes the analyst to frequently return to the knowledge
base to review retrieved data, to issue refined queries to fill in the gaps,
and to further review the results of link analyses. The model synthesis
process causes the analyst to internalize impressions of confidence,
uncertainty, and ambiguity in the evidence, and the implications of
potential missing or negative evidence. Here, the analyst ponders the
potential for denial and deception tactics and the expected subtle
sprignals that might appear in the data.
9. An ACH matrix is created to compare the accrued evidence and argumentation structures supporting each of the competing models. At any
time, this matrix and the associated organizational-financial models
summarize the status of the intelligence process; these may be posted
on the collaboration space and used to identify progress on the workflow management tool (a small sketch of such a matrix appears after this step list).
10. The analyst further internalizes the situation by applying a decision support tool to consider the consequences or implications of each model on
counter-drug policy courses of action relative to the Zehga cartel.
11. Once the analyst has reached a level of confidence to make objective
analytic judgments about hypotheses, results can be digitally published to the requesting consumers and to the collaborative workgroup
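As a rough illustration of the ACH matrix built in step 9 (the evidence items and consistency scores are invented; +1 consistent, 0 neutral, -1 inconsistent), the sketch below scores the three competing organizational hypotheses and favors the one with the least disconfirming evidence:

# Rows are evidence items; columns are the competing hypotheses. Following the ACH
# emphasis on disconfirming evidence, the hypothesis with the fewest inconsistencies wins.
hypotheses = ["centralized", "federated", "loose network"]
matrix = {
    "single financial hub observed":    [+1, 0, -1],
    "regional cells act independently": [-1, +1, +1],
    "no common communications channel": [-1, 0, +1],
}

inconsistency = {h: 0 for h in hypotheses}
for scores in matrix.values():
    for h, s in zip(hypotheses, scores):
        if s < 0:
            inconsistency[h] += 1

for h in sorted(hypotheses, key=lambda name: inconsistency[name]):
    print(f"{h}: {inconsistency[h]} inconsistent evidence item(s)")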
[Table 7.3: Analytic Tools Summary. Exploration tools: information retrieval of relevant text (Section 7.2.2); ontology creation with automated taxonomy generation and metatagging for a corpus of documents; extraction and conversion that parse text and tag entities and events; summarization and tracking; change detection of anomalies or temporal changes in content. Reasoning-support tools: data mining (automated multidimensional clustering to discover patterns and correlated data); text mining (automated linking of conceptually and semantically related textual content); link analysis (automated discovery of relationships among multimedia data and text objects); temporal mapping of events and entities to chronological timelines; spatial mapping of physical objects to a spatial database with registration and overlay of spatial information; knowledge retrieval (automated Q&A deduction on a large-scale knowledge base). Decision-support tools: structured argumentation to construct and test logical arguments; modeling and simulation to evaluate alternative COAs; alternative decision comparison with quantitative sensitivity and comparative analysis of risk and utility, including tools to display ACHs. Representative COTS and GOTS tools named in the table include Pathfinder Query, Brio Intelligence, Verity, Veridian ThemeLink, Business Objects WebIntelligence, Semio, Sequoia, Megaputer TextAnalyst, C-4-U Scout, SPSS Clementine, Pathfinder Matrix, RetrievalWare, Autometric InSight, SAIC KnowledgeBoard, NextLabs TrackEngine, Cartia ThemeScape, Caesius WebQL, Autonomy, i2 Analyst's Notebook, AFRL WebTAS, MITRE GeoNODE, Pathfinder MapViewer, SMU LASSO, Orion OrionMagic, WisdomBuilder, DARPA/GMU Disciple, Visual Analytics VisuaLinks, DiscoveryMachine, Axion Idea Processor, Pathfinder CAMEO, Wincite eWincite, Veridian CIM, SRI SEAS, Knowledge Industries DXpress, the JWARS and NETWARS military simulations, Expert Choice, and Lumina Analytica.]
[Figure: Two analysts collaborating through the knowledge spiral. Each analyst moves through socialize, externalize, combine, and internalize phases, supported by (1) collaboration tools with tacit capture, (2) information retrieval tools producing retrieved data, (3) analytic tools operating on the retrieved data and hypotheses, and (4) hypothesis and decision tools that yield the intelligence results.]
data and judgments. This trend follows the third-wave transition (described in
Chapter 1) that moves from mass production of a standard report to all consumers, to mass customization of products; intelligence consumer portals may be
customized to tailor intelligence delivery to each individual consumer's interests.
Digital production processes employ content technologies that index, structure, and integrate fragmented components of content into deliverable products. In the intelligence context, content includes:
1. Structured numerical data (imagery, relational database queries) and
text [e.g., extensible markup language (XML)-formatted documents]
as well as unstructured information (e.g., audio, video, text, and
HTML content from external sources);
2. Internally or externally created information;
3. Formally created information (e.g., cables, reports, and imagery or signals analyses) as well as informal or ad hoc information (e.g., e-mail,
and collaboration exchanges);
Table 7.4
Production Content Management Service Categories

Content creation — creation and update of content taxonomies (knowledge ontologies); structured authoring of content per standard content descriptions; analyst-appended metadata tags describing external content; automated search and retrieval of relevant external content.
Content translation, content integration, and personalized portal services complete the service categories.
categories to be uniquely tagged. The U.S. IC established a metadata and metadata markup working group to establish a communitywide mandated XML
model to support interoperability of intelligence content across intelligence producers and consumers [11]. Intelligence standards being developed include an
intelligence information markup language (ICML) specification for intelligence
reporting and metadata standards for security, specifying digital signatures
(XML-DSig), security/encryption (XML-Sec), key management (XML-KMS),
and information security marking (XML-ISM) [12]. Such tagging makes the
content interoperable; it can be reused and automatically integrated in numerous ways:
Numerical data may be correlated and combined.
Text may be assembled into a complete report (e.g., target abstract, tar-
to suit unique consumer needs (e.g., portal target summary format, personal digital assistant format, or pilot's cockpit target folder format).
In a typical intelligence application, SQL queries may be used to collect
SIGINT results from a relational database, and then a standard document type
definition (DTD) is used to define the structure of an XML document.
Figure 7.6 illustrates a simple XML SIGINT content example, with tagged
data and metadata that can be read, processed, and integrated with other XML
<?xml version="1.0" ?>
<!DOCTYPE siginttype096report SYSTEM "st096.dtd">
<siginttype096report>
<collect>
<collect-source>Vandal033</collect-source>
<collect-time-zulu year="2001" month="10" day="15"
hour="7" minute="44" second="12.35" />
<platform>Hawk055</platform>
<mission>Condor Talon</mission>
<metadata-text>Excellent 10.3 second capture of emitter
1045-P during data transfer</metadata-text>
<signaldata parameterz1="1032.67" parameterz2="12.345"
parameterz3="10.15" parameterz4="0.00" />
<signallibrarylocation url="vandal.033.sig0234.23.2" />
<userlist>Simplex, Honcho5, Pico299, Gala</userlist>
<targetname>Quintexiplex</targetname>
<processors preproc="K25" postproc="Vector33" />
</collect>
...
</siginttype096report>
content [13]. Finally, a document object model (DOM) tree can be created
from the integrated result to transform the result into a variety of formats (e.g.,
HTML or PDF) for digital publication.
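A minimal sketch of this parse-and-transform step, using Python's standard XML library against a trimmed version of the SIGINT example above (the HTML rendering is illustrative only and is not the output format of any particular production system):

import xml.etree.ElementTree as ET

report = """<siginttype096report><collect>
  <collect-source>Vandal033</collect-source>
  <platform>Hawk055</platform>
  <targetname>Quintexiplex</targetname>
</collect></siginttype096report>"""

# Parse the tagged content into an element tree (the analogue of building a DOM tree),
# then transform selected elements into a presentation format such as HTML.
tree = ET.fromstring(report)
collect = tree.find("collect")
html = "<ul>" + "".join(f"<li>{child.tag}: {child.text.strip()}</li>" for child in collect) + "</ul>"
print(html)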
The analysis and single-source publishing architecture adopted by the U.S.
Navy Command 21 K-Web (Figure 7.7) illustrates a highly automated digital
production process for intelligence and command applications [14]. The production workflow in the figure includes the processing, analysis, and dissemination steps of the intelligence cycle:
1. Content collection and creation (processing and analysis). Both quantitative technical data and unstructured text are received, and content is
extracted and tagged for subsequent processing. This process is applied
to legacy data (e.g., IMINT and SIGINT reports), structured intelligence message traffic, and unstructured sources (e.g., news reports and
[Figure 7.7: Single-source publishing architecture. New data and existing data sources are tagged into XML and stored in a data mart; a product packager applies templates and document type definitions to assemble composite XML-tagged products; presentation agents draw on a style library for (4) publication and distribution in HTML and other formats across the existing Web server and browser infrastructure.]
intelligence e-mail). Domain experts may support the process by creating metadata in a predefined XML metadata format to append to
audio, video, or other nontext sources. Metadata includes source, pedigree, time of collection, and format information. New content created
by analysts is entered in standard XML DTD templates.
2. Content applications. XML-tagged content is entered in the data mart,
where data applications recognize, correlate, consolidate, and summarize content across the incoming components. A correlation agent
may, for example, correlate all content relative to a new event or entity
and pass the content on to a consolidation agent to index the components for subsequent integration into an event or target report. The
data (and text) fusion and mining functions described in the next
chapter are performed here.
3. Content management-product creation (production). Product templates
dictate the aggregation of content into standard intelligence products:
warnings, current intelligence, situation updates, and target status.
These composite XML-tagged products are returned to the data mart.
4. Content publication and distribution. Intelligence products are personalized in terms of both style (presentation formats) and distribution
(to users with an interest in the product). Users may explicitly define
their areas of interest, or the automated system may monitor user
activities (through queries, collaborative discussion topics, or folder
names maintained) to implicitly estimate areas of interest and create a
user's personal profile. Presentation agents choose from the style
library and user profiles to create distribution lists for content to be
delivered via e-mail, pushed to users' custom portals, or stored in the
data mart for subsequent retrieval. The process of content syndication
applies an information and content exchange (ICE) standard to allow
a single product to be delivered in multiple styles and to provide
automatic content update across all users.
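A rough sketch of the personalization step just described (the user names, interest topics, content tags, and styles are invented): a presentation agent builds the distribution list by intersecting each user's interest profile with the product's content tags and noting the preferred delivery style.

# Interest profiles (explicitly defined or implicitly estimated) and the tags on a finished product.
profiles = {
    "analyst_ops": {"topics": {"air defense", "logistics"}, "style": "portal"},
    "pilot_wing5": {"topics": {"target folder", "air defense"}, "style": "cockpit folder"},
    "policy_desk": {"topics": {"sanctions"}, "style": "e-mail"},
}
product_tags = {"air defense", "target folder"}

distribution = [(user, p["style"]) for user, p in profiles.items()
                if p["topics"] & product_tags]   # any overlap of interests and content tags
print(distribution)   # each recipient receives the product rendered in their preferred style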
The user's single entry point is a personalized portal (or Web portal) that
provides an organized entry into the information available on the intelligence
enterprise.
The portal provides an organized single-point access to a broad array of
services (Table 7.5) that are tailored to the user's personal profile [15].
The U.S. IC chief information officer (CIO) introduced a pilot project in
2000 to provide automated digital production of an Intel Gazette product that
may be personalized to individual consumers to deliver daily intelligence articles
that are categorized, clustered (by topic similarity), and summarized [16].
[Table 7.5: Portal service categories and accessible services — collaboration, productivity, and production; enterprise transactions; analysis-synthesis process support; personalized news.]
7.5.1 Information Visualization
Edward Tufte introduced his widely read text Envisioning Information with the
prescient observation that, Even though we navigate daily through a perceptual
world of three dimensions and reason occasionally about higher dimensional
arenas with mathematical ease, the world portrayed on our information displays
of associated data, patterns of linkages and relationships, trends (temporal behavior), and outlying data;
Combine the information by registering, mathematically or logically
7.5.2 Analyst-Agent Interaction
[Table: Visualization methods, functions, and example intelligence applications — text visualization, statistical data visualization, imagery and video visualization, and virtual reality.]
7.6 Summary
The analytic workflow requires a constant interaction between the cognitive and
visual-perceptive processes in the analyst's mind and the explicit representations
of knowledge in the intelligence enterprise. This chapter has reviewed the interactive tools that can aid the analyst in all phases of the knowledge-creation spiral
and can also help the analyst manage the workflow process itself. Tools without
a methodology become a nuisance; this chapter has emphasized the necessary
understanding of how the wide variety of collaborative and analytic tools fit into
an overall methodology to translate raw intelligence feeds into finished, digitally
published intelligence. In the next chapter, we examine the automated services
that capture and combine explicit knowledge to further aid the analyst in coping
with massive volumes of arriving data.
Endnotes
[1] Thompson, J. R., R. Hopf-Weichel, and R. E. Geiselman, The Cognitive Bases of Intelligence Analysis, Research Report 1362, U.S. Army Research Institute for the Behavioral
Sciences, January 1984.
[2] Information retrieval definition by the International Standards Organization, ISO/IEC
2382-1: 1993, Information Technology Vocabulary.
[3] The table is based on the general classes presented by Moldovan, D., et al., LASSO: A
Tool for Surfing the Answer Net, Proc. of the Eighth Text Retrieval Conference
(TREC-8), NIST Special Pub. 500-246-2000, Table 5, p. 183.
[4] Strategic Investment Plan for Intelligence Community Analysis, Washington D.C.: CIA,
ADCI/AP200-01, p. 42.
[5] Intelligence Software Report 2002, Fuld & Co., Cambridge, MA, 2002. This report, first
issued in 2000 and updated in 2002, identified over 40 applicable packages, then reviewed
and scored the applicability of over 10 dedicated CI packages in detail.
[6] Druzdzel, M. J., and R. R. Flynn, Decision Support Systems, in Encyclopedia of Library
and Information Science, Kent, Allen (ed.), New York: Marcel Dekker, 2000.
[7] For an introduction to normative decision support tools, see Clemen, R. T., Making Hard
Decisions: An Introduction to Decision Analysis, Belmont CA: Duxbury, 1996.
[8] See Pathfinder: Pathfinder for the Web v4.6 and Pathfinder Portal Product Summary,
Office of Naval Intelligence Pathfinder Support Office, Washington D.C., 2002; Mucks,
J., Web based Timeline Analysis System, WEBTAS, Air Force Research Laboratory,
March 26, 2002.
[9] Maybury, Mark T., Analytic Tools for the New Millennium, in Proc. of Annual JIVA
Conference, Albuquerque, NM, August 17, 2000.
[10] XML is a subset of the standard generalized markup language standard for creating
markup languages; it is recommended by the W3C Web Consortium for creating markup
languages for Web delivery.
[11] West, T. N., Metadata and Metadata Markup and Homeland Security, IC CIO, 21 August
2002.
[12] See http://www.xml.saic.com/icml/main/ic_core.html, accessed on-line on September 22,
2002.
[13] A DTD is a specification of the permissible data elements for a class of documents, the
parent-child relationships of the elements, and the order in which the elements may
appear.
[14] CINC-21 Knowledge Management Objective Functional Architecture, U.S. Navy
SPAWAR System Center, San Diego, SD-256, August 2000. See also Command21
Knowledge Web, U.S. Navy SPAWAR System Center, San Diego, SD-400, July 2001.
[15] Portals are often categorized as: 1) corporate portals, through which employees (e.g., analysts and operations officers within an intelligence organization) access the organization's intranet, 2) customer portals, which provide tailored services to the intelligence consumer
outside the organization intranet, and 3) vertical portals (vortals), which cross organization
boundaries (e.g., collection, processing, analysis, and operations organizations).
[16] The Intel Gazette, Brochure, Office of the Intelligence Community CIO, Washington
D.C., 2000.
[17] Tufte, E. R., Envisioning Information, Cheshire, CT: Graphics Press, 1990, p.12. This
book and Tufte's companions, The Visual Display of Quantitative Information and Visual
Explanations provide excellent examples of methods to increase the number of dimensions
that can be represented and density of data presented to the human perception on a flat
plane (escaping flatland).
[18] P1000 Science and Technology Strategy for Information Visualization, Washington D.C.:
CIA DS&T, Version 2, December 16, 1996, p. 18.
[19] Arnheim, R., Visual Thinking, Berkeley, CA: University California Press, 1969.
[20] ThemeScape: Award-Winning Text Visualization, AAT Delivers, CIA Office of Advanced
Analytic Tools (AAT), Issue 1, Feb. 2000.
[21] This section presumes knowledge of agent technology. For an overview related to these
applications, see Parunak, H. Van Dyke, Go to the Ant: Engineering Principles from
Natural Multi-Agent Systems, Annals of Operations Research, Vol. 75, 1997, pp. 69–101; and
Parunak, H. Van Dyke, Agents in Overalls: Experiences and Issues in the Development
and Employment of Industrial Agent-Based Systems, International Journal of Cooperative
Information Systems, Vol. 9, No. 3, 2000, pp. 209–227. The proceedings of the annual Workshops on Cooperative Information Agents (CIA) are published by Springer-Verlag (Berlin) beginning with the first workshop (CIA-97) in 1997. The proceedings summarize
research and development in critical areas necessary to make CIA technology viable: agent
languages and specifications, uncertainty management, mobility and security, and
human-agent conversation and dialogue.
[22] See Chen, J. R., Mathe, N., and S. Wolfe, Collaborative Information Agents on the
World Wide Web, Report of the Computation Sciences Division, NASA Ames Research
Center, May 6, 1998; see also Maes, Pattie, Agents that Reduce Work and Information
Overload, Communications of the ACM, Vol. 37, No. 7, July 1994.
[23] Maybury, M. T., Human Computer Interaction: State of the Art and Further Development in the International Context–North America, Proc. of Human Computer Interaction International Status Conference, Saarbruecken, Germany, October 26–27, 2001.
8
Explicit Knowledge Capture and
Combination
In the last chapter, we introduced analytic tools that allow the intelligence analyst to interactively correlate, compare, and combine numerical data and text to
discover clusters and relationships among events and entities within large databases. These interactive combination tools are considered to be goal-driven
processes: the analyst is driven by a goal to seek solutions within the database,
and the reasoning process is interactive with the analyst and machine in a common reasoning loop. This chapter focuses on the largely automated combination processes that tend to be data driven: as data continuously arrives from
intelligence sources, the incoming data drives a largely automated process that
continually detects, identifies, and tracks emerging events of interest to the
user. These parallel goal-driven and data-driven processes were depicted as
complementary combination processes in the last chapter (Figure 7.1). The
automated processes described in this chapter are currently found in the processing phase of the intelligence cycle, especially the IMINT and SIGINT
pipelines, where high-volume, disparate sources are correlated and combined.
The data fusion and mining operations in these pipelines are performed on
numerical data; the subsequent multi-INT fusion and mining operations in the
analysis phase are more often performed on text reports generated from the
IMINT, SIGINT, and other source analyses. In all cases, the combination
processes help sources to cross-cue each other, locate and identify target events
and entities, detect anomalies and changes, and track dynamic targets (refer to
Figure 6.2).
Table 8.1
Knowledge Representation Layers

Upper ontology — common-sense knowledge. Example: (∀ a, b) a ∈ Event ∧ b ∈ Event: causes(a, b) → precedes(a, b).
Core theories — fundamental and formal calculi of inference and mathematics, problem solving, and planning.
Domain-specific theories — domain-specific relationships, behaviors, and models; domain expertise. Example: Air_Attack ⊂ Act_of_War.
Database of facts — domain-specific instances of entities and events. Examples: MIG-27 ∈ Fighter_Aircraft; Kporanta ∈ Nation_State; used-by(MIG-27, Kporanta); Date 10-24-02 Attack ∈ Air_Attack; used-in(Date 10-24-02 Attack, MIG-27).

the fact that an air attack is a subset of the concept act of war (Air_Attack ⊂ Act_of_War), and that Kporanta is an instance of a nation state (Kporanta ∈ Nation_State).
Logical operators define the relationship between propositions (refer to Chapter 5 for an introduction to their use in basic reasoning) to enable propositional logic to perform deduction. The examples in the table include the logical statement that there exists (∃) no aircraft that functions as both fighter and commercial, ¬(∃ Aircraft)[Fighter ∧ Commercial], and that all men are mortal, ∀x: Man(x) → Mortal(x).
Predicate operators make statements about entities and events; they attribute properties, such as the fact that Kporanta employs MIG-27 aircraft, used-by(MIG-27, Kporanta), and allow manipulation by a predicate calculus.
Functional operators allow normal mathematical operations to be performed on quantitative arguments; for example, Missile_Range = F(alt, maneuver, K, p1, p2).
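To make the layered representation concrete, a minimal sketch (in Python rather than a formal knowledge-representation language, using only the examples shown in Table 8.1) encodes a few facts and one subset relation, then deduces that the specific attack instance is an act of war:

# Database-of-facts layer: instances and predicates; domain-theory layer: a subset relation.
is_a = {
    "MIG-27": "Fighter_Aircraft",
    "Kporanta": "Nation_State",
    "Date 10-24-02 Attack": "Air_Attack",
}
subset_of = {"Air_Attack": "Act_of_War"}           # Air_Attack is a subset of Act_of_War
relations = {("used-by", "MIG-27", "Kporanta"),
             ("used-in", "Date 10-24-02 Attack", "MIG-27")}

def classes_of(instance):
    """Deduce every class an instance belongs to by following subset relations upward."""
    found, c = [], is_a.get(instance)
    while c:
        found.append(c)
        c = subset_of.get(c)
    return found

def holds(predicate, *args):
    """Check a simple predicate assertion against the fact base."""
    return (predicate, *args) in relations

print(classes_of("Date 10-24-02 Attack"))          # ['Air_Attack', 'Act_of_War']
print(holds("used-by", "MIG-27", "Kporanta"))      # True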
The structure of a high-performance knowledge base and associated inference engines to support the intelligence process is illustrated in Figure 8.1,
which is adapted from the basic RKF architecture [2]. The figure shows the two
stages of implementing and using such a knowledge base:
1. Knowledge-base creation is the initial (and ever-ongoing) process of accumulating knowledge in the four layers of abstraction in
the knowledge base. Knowledge-base experts encode fundamental
knowledge into the upper ontology and core theories layers. Domain-specific knowledge acquisition is performed by translating domain
experts' natural language to formal expressions (described next), testing the inputs for consistency with prior knowledge, then diagramming the knowledge to allow reviewed, refined verification by the
author or the collaborating team.
2. Automated combination is then performed as the populated knowledge
base is coupled with inference engines to perform inductive, deductive, or abductive reasoning processes to combine current facts (e.g.,
current all-source intelligence feeds) with domain knowledge to infer
combined intelligence. Entities and events are extracted from current
all-source feeds and placed in the factual database layer, where they
are used to derive new intelligence for delivery to the analyst. The figure illustrates two data-driven automatic inference processes that are
[Figure 8.1: Knowledge base and inference architecture. (1) Knowledge-base creation: domain knowledge acquisition and pattern learning populate the core theories, domain-specific theories and models, and known-pattern layers. (2) Automated combination: evidence extracted from incoming all-source intelligence feeds enters the database of facts as current data, where data fusion (deduction) detects known patterns and data mining (induction) discovers new patterns that are reported to the intelligence analyst.]
described in subsequent sections in this chapter. A deductive datainformation fusion process deduces (detects) and reports to the analyst
the presence of entities, events, and situations of interest based on
known domain-specific patterns within the knowledge base. An
inductive data-information mining process induces (discovers) new,
general patterns (relationships) in the incoming data and proposes
these generalizations to the analyst for validation.
The data fusion and mining processes are implemented in an inference
engine, which applies the fundamental inferential axioms that are encoded in
the core theory layer, guided by the problem-solving process in the core theory
layer with a knowledge of the general world provided in the upper ontology. Of
course, the system just described is a robust and ambitious, though not yet
operational, approach to general-purpose reasoning applied to the intelligence
domain. Even as this technology matures to provide powerful combination
capabilities for the analyst, current data fusion and mining technologies are providing automated combination within narrow domains; these capabilities are
described in the following section.
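A compact sketch of the two data-driven processes on a toy event stream (the event codes, the known pattern, and the counts are invented): fusion-style detection matches incoming facts against a known pattern, while mining-style discovery proposes a recurring pair not yet in the pattern library for the analyst to validate.

from collections import Counter

events = ["convoy", "comms spike", "radar on", "convoy", "comms spike",
          "radar on", "missile move", "radar on", "missile move"]

# Deductive detection: report each occurrence of a known two-step pattern.
known_pattern = ("convoy", "comms spike")
detections = [i for i in range(len(events) - 1)
              if (events[i], events[i + 1]) == known_pattern]

# Inductive discovery: propose the most frequent adjacent pair not already known.
pairs = Counter(p for p in zip(events, events[1:]) if p != known_pattern)
candidate, count = pairs.most_common(1)[0]

print("known pattern detected at positions:", detections)
print("candidate discovered pattern:", candidate, "seen", count, "times")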
[Table 8.2: Comparison of the automated combination processes — for each process the table compares the knowledge created, the reasoning process, the knowledge patterns used to detect or discover knowledge, the detection/discovery process, the object of the detection/discovery process and the knowledge gained, and representative applications such as event detection.]
characterize data understood, while data fusion applies those descriptions to detect the presence of patterns in new data.
Uncertainty in inferred knowledge. The data and reference patterns are
8.2.1 Data Fusion
causal or relationship model, then applying all-source observation to detect entities and events to estimate the properties of the model. Consider the levels of
representation in the simple target-observer processes in Figure 8.2 [6]. The
adversary leadership holds to goals and values that create motives; these motives,
combined with beliefs (created by perception of the current situation), lead to
intentions. These intentions lead to plans and responses to the current situation;
from alternative plans, decisions are made that lead to commands for action. In
a hierarchical military, or a networked terrorist organization, these commands
flow to activities (communication, logistics, surveillance, and movements).
Using the three domains of reality terminology introduced in Chapter 5, the
motive-to-decision events occur in the adversary's cognitive domain with no
observable phenomena. The actions of explicitly recording and communicating
these plans to operating units occur in the symbolic domain (communicated via
human or machine language over communication systems) and the actions of
those units appear in the physical domain. The data-fusion process uses observable evidence from both the symbolic and physical domains to infer the operations, communications, and even the intentions of the adversary.
The adversary's causal flow-down is shown in the figure on the left, and
the upward-flowing inference chain of data fusion is shown on the right. The
deductive process is partitioned into levels to match the abstractions of the
[Figure 8.2: Target and observer levels of representation. On the target (adversary) side, goals and values create motives; motives and beliefs lead to intentions, plans and responses, decisions, and activities whose actions and reactions produce observable phenomena. On the observer side, levels 0 and 1 estimate the observables, level 2 estimates structure and behavior, and higher-level inference estimates adversary plans and decisions, compares them with the observer's own COAs, and forecasts consequences.]
adversary (target) behavior. Activities in the physical world are estimated at levels 0 and 1 by directly observing, detecting, identifying, and tracking entities
over time. Level 2 estimates the composite situation of all entities and their relationships (e.g., aggregating entities into military units or terrorist cells and their
command structures), and the temporal behavior of their activities. From this,
the adversary's plans and intentions are estimated and compared with the
observers goals and alternative COAs to forecast consequences.
The emerging concept of effects-based military operations (EBO) requires
intelligence products that provide planners with the ability to model the various
effects influencing a target that make up a complex system. Planners and operators require intelligence products that integrate models of the adversary physical
infrastructure, information networks, and leadership and decision making [7].
The U.S. DoD JDL has established a formal process model of data fusion
that decomposes the process into five basic levels of information-refining
processes (based upon the concept of levels of information abstraction) [8]:
Level 0: Data (or subobject) refinement. This is the correlation across sig-
[Figure 8.3 diagram: the fusion levels arranged by level of abstraction — level 0 data refinement and preprocessing operate on measurements and observations from multiple sensors and sources (S1 through Sn); level 1 object refinement maintains the object base; level 2 situation refinement maintains the situation base; level 3 threat refinement delivers knowledge (intelligence) that is understood and explained to the user; level 4 process refinement and resource management feed sensor and source requests back to collection.]
Figure 8.3 Data fusion is a process of deductive reasoning to correlate and combine multiple sources of data in order to understand a complex physical process.
Raw data from sensors may be calibrated, corrected for bias and gain errors, limited (thresholded), and filtered to remove systematic noise sources. Object
detection may occur at this pointin individual sensors or across multiple sensors (so-called predetection fusion). The object-detection process forms observation reports that contain data elements such as observation identifier, time of
measurement, measurement or decision data, decision, and uncertainty data.
8.2.1.2 Level 1: Object Refinement
Sensor and source reports are first aligned to a common spatial reference (e.g., a
geographic coordinate system) and temporal reference (e.g., samples are propagated forward or backward to a common time.) These alignment transformations place the observations in a common time-space coordinate system to allow
c=
w x
i
i 1 =1
Where;
wi = weighting coefficient for attribute xi.
xi = ith correlation attribute metric.
Values of x_i may include spatial distances (how close were the physical
locations of the observations?), statistical distances (how similar were the measurements?), or spectral compatibility (how feasible was it for the measurements to
arise from a common source?). The weighting coefficients, w_i, may be used to
weight each contribution by relative importance or by absolute strength of contribution (e.g., inverse weighting by covariance statistics). The correlation metric may be used to make a hard decision (an association), choosing the most
likely pairings of observations, or a deferred decision, assigning more than one
hypothetical pairing and deferring a hard decision until more observations arrive.
Once observations have been associated, two functions are performed on each
associated set of measurements for common object:
1. Tracking. For dynamic targets (vehicles or aircraft), the current state of
the object is correlated with previously known targets to determine if
the observation can update an existing model (track). If the
newly associated observations are determined to be updates to an existing track, the state estimation model for the track (e.g., a Kalman filter) is updated; otherwise, a new track is initiated.
2. Identification. All associated observations are used to determine if the
object identity can be classified to any one of several levels (e.g.,
friend/foe, vehicle class, vehicle type or model, or vehicle status or
intent).
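A minimal numerical sketch of the association step (the weights, attribute similarities, and decision threshold are invented): the correlation metric c = Σ w_i x_i is computed for each candidate pairing of a new observation with existing tracks, and the best pairing above the threshold becomes a hard association.

# Candidate pairings of one new observation against two existing tracks; each
# attribute metric x_i is a normalized similarity in [0, 1].
weights = {"spatial": 0.5, "statistical": 0.3, "spectral": 0.2}
candidates = {
    "track-07": {"spatial": 0.9, "statistical": 0.8, "spectral": 1.0},
    "track-12": {"spatial": 0.2, "statistical": 0.6, "spectral": 0.5},
}

def correlation(x):
    return sum(weights[k] * x[k] for k in weights)   # c = sum over i of w_i * x_i

scores = {track: correlation(x) for track, x in candidates.items()}
best = max(scores, key=scores.get)
THRESHOLD = 0.7
if scores[best] >= THRESHOLD:
    print(f"associate the observation with {best} (c = {scores[best]:.2f}) and update its state estimate")
else:
    print("defer the decision or initiate a new track")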
8.2.1.3 Level 2: Situation Refinement
synoptic understanding of all objects, in their space-time context, provides situation knowledge, or awareness.
8.2.1.4 Level 3: Impact (or Threat) Refinement
This process provides feedback control of the collection and processing activities
to achieve the intelligence requirements. At the top level, current knowledge
(about the situation) is compared to the intelligence requirements required to
achieve operational objectives to determine knowledge shortfalls. These shortfalls
are parsed, downward, into information, then data needs, which direct the future
acquisition of data (sensor management) and the control of internal processes.
Processes may be refined, for example, to focus on certain areas of interest, object
types, or groups. This forms the feedback loop of the data-fusion process.
[Figure: Example factors in impact (threat) refinement — current intelligence (threat cell activities, communication traffic levels, estimated intent and capacity) and constraints (world situation, alert and security levels, logistics required, observability of cells).]
8.2.2 Data Mining
Data mining is the process by which large sets of data (or text in the specific
case of text mining) are cleansed and transformed into organized and indexed
sets (information), which are then analyzed to discover hidden and implicit,
but previously undefined, patterns. These patterns are reviewed by domain
experts to determine if they reveal new understandings of the general structure
and relationships (knowledge) in the data of a domain under observation. The
[Table 8.3: Distinctions Between the Data Fusion Processing Levels. For each level the table compares the level of information abstraction (data, i.e., measurements and observations, at level 0; objects, i.e., events and entities, at level 1; groups and situations above that), the functions performed (signal estimation and composite sensor detection; object estimation through detection, association, combination, tracking, and classification; group estimation through group detection or aggregation, group association, group combination, group tracking, and group classification, with modeling of associations and behavior; impact prediction and prediction of future behavior or courses of action), the temporal focus (from a single observation to a period spanning a small sequence of observations), and the general output products (object reports of entities or events; assessments of impact and implications to the objectives).]
data-mining process can also be applied to text, where the discovered patterns
may include clusters of related articles, linked authors, or topics.
The object of discovery is a pattern, which is defined as a statement in
some language, L, that describes relationships in subset Fs of a set of data, F,
such that:
1. The statement holds with some certainty, c;
2. The statement is simpler (in some sense) than the enumeration of all
facts in Fs [13].
This is the inductive generalization process described in Chapter 5. Mined
knowledge, then, is formally defined as a pattern that is interesting, according to
some user-defined criterion, and certain to a user-defined measure of degree. As
an example, consider the following case:
Terrorist organization patterns:
The data mining literature has predominantly addressed business applications that seek to locate economic or buying patterns in warehouses of data,
including point-of-sale data [15]. The increased availability of warehoused data
and the potential economic benefits of improved knowledge of purchasing patterns have spurred significant research and development in the mining process.
The term is used to refer to a range of processes, from manual analysis of data
using visualization tools alone, to automated techniques that navigate and
explore data searching for interesting patterns.
The Cross Industry Standard Process for Data Mining (CRISP-DM) is
emerging as a standard reference model, as the JDL model is for data fusion
[16]. The general functions of data mining can be structured (Figure 8.5) to
[Figure 8.5 diagram: the data mining process by level of abstraction — cleansing filters and data selection and transformation move measurements and observations from the data warehouse into a transformed information base (data placed in context, indexed, and organized); data mining operations and discovery modeling produce dimensional presentations; visualization supports the analyst, whose refined queries and refined requests for additional data feed back to the selection, transformation, and warehousing steps.]
Figure 8.5 Data mining is an inductive process of evaluating data to locate patterns in the
data that explain previously unknown general relationships in the underlying
physical processes.
the warehouse, initially in the native format of the source. One of the
chief issues facing many mining operations is the reconciliation of
diverse database formats that have different formats (e.g., field and
record sizes and parameter scales), incompatible data definitions, and
other differences. The warehouse collection process (flow in) may mediate between these input sources to transform the data before storing in
common form [20].
Data cleansing. The warehoused data must be inspected and cleansed to
identify and correct or remove conflicts, incomplete sets, and incompatibilities common to combined databases. Cleansing may include
several categories of checks (a small illustrative sketch follows this list):
1. Uniformity checks verify the ranges of data, determine if sets exceed
limits, and verify that format versions are compatible.
2. Completeness checks evaluate the internal consistency of data sets to
ensure, for example, that aggregate values are consistent with individual data components (e.g., verify that total sales is equal to sum of all
sales regions, and that data for all sales regions is present).
3. Conformity checks exhaustively verify that each index and reference
exists.
4. Genealogy checks generate and check audit trails to primitive data to
permit analysts to drill down from high-level information.
Data selection and transformation. The types of data that will be used for
mining are selected on the basis of relevance. For large operations, initial mining may be performed on a small set, then extended to larger
sets to check for the validity of abducted patterns. The selected data
may then be transformed to organize all data into common dimensions
and to add derived dimensions as necessary for analysis.
Data mining operations. Mining operations may be performed in a
to fit the data patterns detected. This is the predictive aspect of mining: modeling the historical data in the database (the past) to provide a
model to predict the future. The model attempts to abduct a generalized description that explains discovered patterns of interest and, using
statistical inference from larger volumes of data, seeks to induct generally applicable models. Simple extrapolation, time-series trends, complex linked relationships, and causal mathematical models are examples
of models created.
Visualization. The analyst uses visualization tools that allow discovery
(Typical mining operations include description, clustering, statistical analysis, rule abduction, deviation analysis, and neural abduction.)
8.2.3 Integrated Data Fusion and Mining
(The figure couples the data mining chain of cleansing, transformation, mining search operations over a shared object base, and discovery modeling with the JDL fusion levels: level 0 signal data refinement of sensor inputs, level 1 object refinement, level 2 situation refinement, and level 3 impact assessment; visualization supports validation and management as data rises to information and to knowledge that is understood and explained.)
Figure 8.6 An integrated data mining and fusion architecture. (Source: [21]. © 1998 IEEE; used by permission.)
Models and simulations are inherently collaborative; their explicit representations (versus mental models) allow analytic teams to collectively assemble and explore the accumulating knowledge that they represent. They support the
analysis-synthesis process in multiple ways:
Evidence marshaling. As described in Chapter 5, models and simulations
Table 8.5 (Typical Intelligence Modeling and Simulation Applications) organizes representative models and simulations by domain, including the cognitive and physical domains.
8.3.1
The challenge of I&W demands predictive analysis, where the analyst is looking at something entirely new, a discontinuous phenomenon, an outcome that
he or she has never seen before. Furthermore, the analyst only sees this new pattern emerge in bits and pieces [30]. The text, Preventative Measures, reports on
a variety of M&S tools developed to provide warning of incipient crises (e.g.,
violent societal behavior, interstate conflict, or state failure) for public policymakers, the IC, and the DoD [31]. The tools monitor world events to track the
state and time-sequence of state transitions for comparison with indicators of
stress. These analytic tools apply three methods to provide indicators to analysts:
1. Structural indicator matching. Previously identified crisis patterns (statistical models) are matched to current conditions to seek indications in background conditions and long-term trends (a simplified sketch of such matching follows this list).
2. Sequential tracking models. Simulations track the dynamics of events to
compare temporal behavior with statistical conflict accelerators in current situations that indicate imminent crises.
3. Complex behavior analysis. Simulations are used to support inductive exploration of the current situation, so the analyst can examine possible future scenarios to locate potential triggering events that may cause instability (events not captured in prior indicator models).
A general I&W system architecture (Figure 8.7), organized following the
JDL data-fusion structure, accepts incoming news feed text reports of current
situations and encodes the events into a common format (by human or automated coding). The event data is encoded into time-tagged actions (assault, kidnap, flee, assassinate), proclamations (threaten, appeal, comment), and other
pertinent events from relevant actors (governments, NGOs, terror groups). The
level 1 fusion process correlates and combines similar reports to produce a single
set of current events organized in time series for structural analysis of background conditions and sequential analysis of behavioral trends by groups and
interactions between groups. This statistical analysis is an automatic target-recognition process, comparing current state and trends with known clusters of
unstable behaviors. The level 2 process correlates and aggregates individual
events into larger patterns of behavior (situations). A dynamic simulation tracks
the current situation (and is refined by the tracking loop shown) to enable the
analyst to explore future excursions from the present condition. By analysis of
the dynamics of the situation, the analyst can explore a wide range of feasible
futures, including those that may reveal surprising behavior that is not intuitive, increasing the analyst's awareness of unstable regions of behavior or the potential of subtle but potent triggering events.
(The figure shows political, economic, military, and societal event reports preanalyzed into a time-tagged event database recording source, time, country, location, event type, actors, and impacts; level 1 event refinement correlates and combines event reports; level 2 situation refinement correlates and aggregates events into situations for structural and sequential analysis and for complex behavior simulation of future scenarios, yielding indicator conditions, trends, and crisis warnings.)
8.3.2
The complex behavior noted in the prior example may result from random
events, human free will, or the nonlinearity introduced by the interactions of
many actors. The most advanced applications of M&S are those that seek to
model environments (introduced in Section 4.4.2) that exhibit complex behaviors: emergent behaviors (surprises) that are not predictable from the individual contributing actors within the system. Complexity is the property of a
system that prohibits the description of its overall behavior even when all of the
components are described completely. Complex environments include social
behaviors of significant interest to intelligence organizations: populations of
nation states, terrorist organizations, military commands, and foreign leaders
[32]. Perhaps the grand challenge of intelligence analysis is to understand an
adversary's cognitive behavior to provide both warning and insight into the
effects of alternative preemptive actions that may avert threats. The U.S. DCI
has articulated this challenge:
To this day, Intelligence is always much better at counting heads than divining what is going on inside them. That is, we are very good at gauging the
size and location of militaries and weaponry. But for obvious reasons, we can
never be as good at figuring out what leaders will do with them. Leadership
analysis remains perhaps the most difficult of analytic specialties [33].
Nonlinear mathematical solutions are intractable for most practical problems, and the research community has applied dynamic systems modeling and
agent-based simulation (ABS) to represent systems that exhibit complex behavior [34]. ABS research is being applied to the simulation of a wide range of
organizations to assess intent, decision making and planning (cognitive), command and finances (symbolic), and actions (physical). The applications of these
simulations include national policies [35], military C2 [36], and terrorist
organizations [37]. As these technologies mature, such tools will increasingly aid
analysts in the study of the cognitive domain.
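A minimal toy sketch of the agent-based idea follows (the agent rule and parameters are invented for illustration and do not represent any of the cited simulations); it shows how repeated local interactions among simple agents produce an aggregate property that no single agent specifies:

import random

class Agent:
    """A toy actor whose disposition shifts toward the agents it interacts with."""
    def __init__(self, disposition):
        self.disposition = disposition  # -1.0 hostile ... +1.0 cooperative

    def interact(self, other, rate=0.1):
        # Each agent moves a small step toward the other's disposition.
        delta = rate * (other.disposition - self.disposition)
        self.disposition += delta
        other.disposition -= delta

def run(num_agents=50, steps=200, seed=1):
    random.seed(seed)
    agents = [Agent(random.uniform(-1, 1)) for _ in range(num_agents)]
    for _ in range(steps):
        a, b = random.sample(agents, 2)   # random pairwise interaction
        a.interact(b)
    # Aggregate (population-level) property observed by the analyst.
    return sum(agent.disposition for agent in agents) / num_agents

print(f"mean disposition after simulation: {run():+.3f}")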
8.4 Summary
This chapter has introduced the potential contribution of automated reasoning
systems that can capture and apply explicit knowledge to locate critical information, focus attention on key issues, detect the presence of known patterns, or discover new patterns in massive volumes of incoming data. These tools will also
support the creation of complex models and simulations of physical, symbolic,
and cognitive systems of interest to the analyst, enabling the analyst to explore
and experience these targets to gain deeper understanding of their structure and
behavior. A recent Harvard Business Review article chronicled the promise of "Predicting the Unpredictable" by agent-based simulation; this technology may indeed enable a degree of this capability for the analyst, too [38]. But these
intelligence models and simulations must always be subject to an evaluation of
appropriateness, validity, and usefulness; they can never be used blindly or
trusted implicitly. These combination tools must never be viewed as black boxes
that provide answers without explanation; rather, they must be viewed as transparent boxes that allow the analyst to enter deeply into a problem to gain insight not
available by a more cursory examination [39].
Endnotes
[1] Cohen, P., et al., The DARPA High-Performance Knowledge Bases Project, AI Magazine, Winter 1998, pp. 25–49; and Burke, M., Rapid Knowledge Formation, briefing, DARPA Information Technology Office, 2000.
[2] This figure is adapted from Burke, M., Rapid Knowledge Formation, briefing, DARPA Information Technology Office, 2000.
[3] Section 8.2 and subsections are adapted from material in the author's text, Information Warfare Principles and Operations, Norwood, MA: Artech House, 1998.
[4] Definition from the article written by Buede, D., and E. Waltz, Data Fusion, in McGraw Hill Encyclopedia of Science and Technology, New York: McGraw Hill, 1998. An
expanded definition provided by the U.S. DoD JDL is: "A process dealing with the association, correlation, and combination of data and information from single and multiple sources to achieve refined position and identity estimates, and complete and timely assessments of situations and threats, and their significance. The process is characterized by continuous refinements of its estimates, and by evaluation of the need for additional sources, or modification of the process itself to achieve improved results."
[5] Waltz, E. L., and D. M. Buede, Data Fusion and Decision Support for Command and
Control, IEEE Trans. on Systems, Man and Cybernetics, Vol. SMC-16, No. 6, November–December 1986.
[6] It is important to note that this basic model is used for illustrative purposes; real-world
adversarial systems are complex and not so easily represented by rigid doctrinal and hierarchical models. The reader is referred to Section 8.3, where simulation approaches are
applied to complex situations.
[7] For an introduction to EBO and intelligence implications, see Davis, P. K., Effects Based
Operations: A Grand Challenge for the Analytical Community, Santa Monica, CA: RAND,
2001, accessed on-line on October 30, 2002 at http://www.rand.org/publications/MR/
MR1477. See also Christian, M. C., and J. E. Dillard, Why We Need a National Joint
Targeting Center, Air and Space Power Chronicles, Jan 2000, accessed on-line on October
30, 2002 at http://www.airpower.maxwell.af.mil/airchronicles/cc/Dillard.html. In the
wake of Operation ALLIED FORCE, the United Kingdom has also analyzed the need for
integrated targeting and has established a Directorate of Targeting and Information
Operations. See Defence Second Report on the Lessons of Kosovo (HC 347-I), U.K. House of
Commons, Select Committee on Defence, January 24, 2001, ANNEX.
[8] The evolution of the JDL model is articulated in a series of documents: White Jr., F. E.,
Data Fusion Lexicon, Joint Directors of Laboratories, Technical Panel for C3, Data Fusion
SubPanel, Naval Ocean System Center, San Diego, CA, 1987; White Jr., F. E., A Model
for Data Fusion, Proc. First National Symp. on Sensor Fusion, Vol. 2, 1988; and Steinberg,
A. N., C. L. Bowman, and White Jr., F. E., Revisions to the JDL Data Fusion Model,
Proc. of Third NATO/IRIS Conf., Quebec City, Canada, 1998.
[9] Waltz, E. L., and J. Llinas, Multisensor Data Fusion, Norwood, MA: Artech House, 1990.
[10] Hall, D., Mathematical Techniques in Data Fusion, Norwood, MA: Artech House, 1992.
[11] Antony, R., Principles of Data Fusion Automation, Norwood MA: Artech House, 1995.
[12] Hall, D. L., and J. Llinas, Handbook of Multisensor Data Fusion, Boca Raton: CRC Press,
2001.
[13] Piatetsky-Shapiro, G., and W. J. Frawley (eds.), Knowledge Discovery in Databases, Menlo
Park, CA: AAAI Press/MIT Press, 1991, p. 3.
[14] This example is very specific for purposes of illustration; however, mining criteria can be much more general in nature (yielding more possible relationships).
[15] An Overview of Data Mining at Dun & Bradstreet, DIG white paper 95/01, Pilot Software,
Cambridge, MA, September 1995.
[16] Chapman, P., et al., CRISP-DM 1.0: Step-by-Step Data Mining Guide, CRISP-DM
Consortium, 2000, accessed on-line on October 30, 2002 at http://www.spss.com/
CRISPDM.
[17] Mattison, R., and R. M. Mattison, Data Warehousing and Data Mining for Telecommunications, Norwood, MA: Artech House, 1997.
[18] Gardner, C., IBM Data Mining Technology, IBM Corporation, April 11, 1996.
[19] Fayyad, U. M., et al. (eds.), Advances in Knowledge Discovery and Data Mining, Cambridge, MA: MIT Press, 1996.
[20] Wiederhold, G., Mediators in the Architecture of Future Information Systems, IEEE
Computer, March 1992, pp. 38–49.
[21] Waltz, E. L., Information Understanding: Integrating Data Fusion and Data Mining
Processes, Proc. of IEEE International Symposium on Circuits and Systems, Monterey, CA, May 31–June 4, 1997.
[22] Davis, J., Combatting Mind Set, Studies in Intelligence, Vol. 36, No. 5, 1992, p. 35.
[23] Definitions from the Defense Modeling and Simulation Office Modeling & Simulation
Glossary.
[24] The Military Operations Research Society has studied the specific role of modeling and
simulation for military operations analysis. See Clements, D., and S. Iwanski, Evolving
Principles of Operations Analysis in DoD Workshop, February 29–March 2, 2000, Naval
Postgraduate School, Monterey, CA.
[25] Schrage, M., Serious Play: How the World's Best Companies Simulate to Innovate, Boston:
HBR Press, 2001, p. 117.
[26] In Chapter 5 (Section 5.5), we introduced these three domains as the fundamental three
representations of reality, based on historical philosophical thought. Some have proposed
intelligence models with more than four domains (e.g., physical, information, political,
cultural, financial, and legal), though all can be organized under the fundamental three
categories. See Smith, R., Counter Terrorism Simulation: A New Breed of Federation,
Proc. Spring 2002 Simulation Interoperability Workshop, March 2002.
[27] See a description of a coupled three-domain architecture in the author's paper: Waltz, E., Data Fusion in Offensive and Defensive Information Operations, Proc. of National Symposium of Sensor and Data Fusion, San Antonio, TX, June 2000, pp. 219–232.
[28] The MOVES Institute Brochure, Naval Postgraduate School, 2002, accessed on-line on
September 27, 2002, at http://www.movesinstitute.org/MOVESbrochure8-02.pdf.
[29] Protecting America's Freedom in the Information Age: A Report of the Markle Foundation Task Force, report of the Working Group on Analytic Methods, The Markle Foundation, New York, October 2002, p. 144.
[30] McCarthy, M., The National Warning System: Striving for an Elusive Goal, Defense
Intelligence Journal, Vol. 3, 1994, p. 9.
[31] Davies, J., and T. Gurr (eds.), Preventative Measures: Building Risk Assessment and Crisis
Early Warning Systems, Lanham, MD: Rowman & Littlefield, 1998.
[32] See Bremer, S., Simulated Worlds: A Computer Model of National Decision Making, Princeton, NJ: Princeton University Press, 1977; Czerwinski, T. (ed.), Coping With the Bounds:
Speculations on Nonlinearity in Military Affairs, Washington D.C.: NDU Press, 1997;
Alberts, D. S., and T. Czerwinski (eds.), Complexity, Global Politics and National Security,
Washington D.C.: NDU Press, 1998.
[33] Remarks of DCI George J. Tenet, Opening Remarks, The Conference on CIA's Analysis of the Soviet Union, 1947–1991, Princeton University, March 8, 2001.
[34] For an introduction to these topics, see Robinson, C., Dynamical Systems, Boca Raton:
CRC Press, 1995; and Axelrod, R., The Complexity of Cooperation, Princeton, NJ: Princeton University Press, 1997.
[35] Axelrod, R., and M. Cohen, A Complex Adaptive Systems Approach to Information Policy,
Report Sponsored by OASD for C3I, June 8, 1997.
[36] Hunt, C. L. T. C., and I. Sais, Complexity-Based Modeling and Simulation: Modeling
Innovation at the Edge of Chaos, Proc. of 1998 Command and Control Research and Technology Symposium, CCRP, Monterey, CA, June 29–July 1, 1998.
[37] The MOVES Institute Brochure, p. 11.
[38] Bonabeau, E., Predicting the Unpredictable, Harvard Business Review, March 2002,
pp. 109–116.
[39] For an excellent cautionary article on the use of models and simulations, see the classic
article by Sterman, J. D., A Skeptic's Guide to Computer Models, MIT Sloan School of
Management, accessed on-line on September 27, 2002 at http://web.mit.edu/jsterman/
www/Skeptics_Guide.html.
9
The Intelligence Enterprise Architecture
The processing, analysis, and production components of intelligence operations are implemented by enterprises: complex networks of people and their business processes, integrated information and communication systems, and technology components organized around the intelligence mission. As we have
emphasized throughout this text, an effective intelligence enterprise requires
more than just these components; the people require a collaborative culture,
integrated electronic networks require content and contextual compatibility,
and the implementing components must constantly adapt to technology trends to remain competitive. The effective implementation of KM
in such enterprises requires a comprehensive requirements analysis and
enterprise design (synthesis) approach to translate high-level mission statements into detailed business processes, networked systems, and technology
implementations.
The central conceptual property of an enterprise is called its architecture.
The means of representing the architecture is by architectural descriptions: the explicit representations of the enterprise's organizational structure of components and their relationships [1]. Because the enterprise includes people, systems, and technology components, its architecture descriptions must analyze and describe the many complicated facets and interactions of these human, information, and physically implemented technology
components.
In this chapter, we introduce the structured methodologies to perform the
translation from mission objectives to the enterprise architecture descriptions
that enable the implementation of KM in real enterprises.
(A figure depicts a community information-sharing architecture: intelligence cell SCI enclaves with private data, local directories, and local applications sit behind firewalls; a certificate authority and private key infrastructure support organization and communitywide directories; organizations share applications and data in a community SCI-level collaboration space with compartmented SCI spaces and collaboration storage; and a controlled interface connects to a collateral secret collaboration space.)
Language (UML), or functional decomposition using Integrated Definition Models (IDEF) explicitly describe entities and attributes, data,
functions, and relationships. These methods also support enterprise
functional simulation at the owner and designer level to permit evaluation of expected enterprise performance.
Detailed functional standards (e.g., IEEE and DoD standards specifications)
The second descriptive methodology is the U.S. DoD Architecture Framework (formerly the C4ISR Architecture Framework), which defines three interrelated perspectives or architectural views, each with a number of defined products [7]. The three interrelated views (Figure 9.2) are as follows:
1. Operational architecture is a description (often graphical) of the operational elements, intelligence business processes, assigned tasks, workflows, and information flows required to accomplish or support the
intelligence function. It defines the type of information, the frequency
of exchange, and what tasks are supported by these information
exchanges.
2. Systems architecture is a description, including graphics, of the systems
and interconnections providing for or supporting intelligence functions. The system architecture defines the physical connection, location, and identification of the key nodes, circuits, networks, and users
and specifies system and component performance parameters. It is
constructed to satisfy operational architecture requirements per standards defined in the technical architecture. This architecture view
shows how multiple systems within a subject area link and interoperate
and may describe the internal construction or operations of particular
systems within the architecture.
3. Technical architecture is a minimal set of rules governing the arrangement, interaction, and interdependence of the parts or elements
whose purpose is to ensure that a conformant system satisfies a specified set of requirements. The technical architecture identifies the services, interfaces, standards, and their relationships. It provides the
technical guidelines for implementation of systems upon which engineering specifications are based, common building blocks are built,
and product lines are developed.
The three views are fully interrelated, although they may also be viewed in layers to distinguish the elements, models, and metrics that comprise the enterprise (Figure 9.2). Specific DoD architecture products are defined for each of the three views (Table 9.1) and for a comprehensive (all-view) top-level description that is similar to the owner row of the Zachman framework. It is important to note that the CIOs of U.S. federal organizations have also defined the Federal Enterprise Architecture Framework, suitable for non-DoD organizations, to organize and manage the development and maintenance of architecture descriptions [8].
The DoD framework products have been mapped into the Zachman
framework by Sowell to illustrate the similarity and compatibility between the
two approaches [9]. Both approaches provide a framework to decompose the
enterprise into a comprehensive set of perspectives that must be defined before
building; following either approach introduces the necessary discipline to structure the enterprise architecture design process.
(Figure 9.2 layers the three views: the operational architecture describes the enterprise as virtual, dynamic learning workgroups connected by a quantitative intellectual value chain and workflows; the system architecture describes the distributed, collaborative computing environment of networks, distributed and replicated data, distributed applications, objects, and agents; and the technical architecture describes the physical network technologies, such as LAN and ATM switches, and the technology components and software services. Each layer carries its own enterprise elements and metrics, from return on information and measures of effectiveness (MOEs) at the top, to measures of performance (MOPs), and to bandwidth, capacity, and storage at the bottom.)
Table 9.1
Major DoD Architecture Framework Products
All views: AV-1; AV-2 (integrated dictionary).
Operational view: OV-1 (high-level operational concept graphic); OV-2; OV-3 (operational information exchange matrix); OV-6a; OV-6b; OV-6c; OV-7.
Systems view: SV-1; SV-2 (communication description); SV-3 (systems matrix of system-to-system relationships); SV-4 (functional description); SV-10c (event/trace model); SV-11.
Technical view: TV-1; TV-2 (standards technology forecast).
From: [10]. (A data-structure sketch of this view-to-product mapping follows.)
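As an illustrative sketch only (the structure is an assumption for illustration, not a prescribed DoD representation, and the product names are limited to those recoverable from Table 9.1), the view-to-product mapping can be captured as simple data:

from dataclasses import dataclass, field

@dataclass
class Product:
    ref: str        # e.g., "OV-1"
    name: str = ""  # product title, where known

@dataclass
class ArchitectureDescription:
    views: dict = field(default_factory=dict)  # view name -> list of Products

dodaf = ArchitectureDescription(views={
    "all": [Product("AV-1"), Product("AV-2", "Integrated dictionary")],
    "operational": [Product("OV-1", "High-level operational concept graphic"),
                    Product("OV-2"),
                    Product("OV-3", "Operational information exchange matrix"),
                    Product("OV-6a"), Product("OV-6b"), Product("OV-6c"), Product("OV-7")],
    "system": [Product("SV-1"), Product("SV-2", "Communication description"),
               Product("SV-3", "Systems matrix"), Product("SV-4", "Functional description"),
               Product("SV-10c", "Event/trace model"), Product("SV-11")],
    "technical": [Product("TV-1"), Product("TV-2", "Standards technology forecast")],
})

# Example: list the operational-view products defined for the enterprise.
print([p.ref for p in dodaf.views["operational"]])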
which focuses on optimization, completeness, and a build-from-scratch originality [11]. Enterprise (or system) architecting recognizes that most enterprises
will be constructed from a combination of existing and new integrating
components:
Policies, based on the enterprise strategic vision;
People, including current cultures that must change to adopt new and
the rapidly changing competitive landscape in its high-tech news business [13]. New competitive threats from cable and traditional news networks that broadcast business and technology segments are capturing an increasing share of FaxTech's market. The FaxTech board has issued an urgent requirement for the rapid
enterprise architect considers the multiple views of the Zachman architecture framework and creates a custom matrix of views appropriate for
the small CI enterprise (Figure 9.3). The appropriate (generally not all)
products to describe the enterprise are identified (unnecessary products
are blank in the figure). Each cell of the matrix contains the title of the
specific product(s) to be produced for the view that addresses a focus
(column) and stakeholder (row). The matrix identifies 10 numbered
cells that are provided (at the top level only) in the following subsections to illustrate the design approach. The arrows in the matrix also
illustrate the top-down design flow from the high-level planner's viewpoint (FaxTech's top management) to the owner's viewpoint (the director of the CI unit), and on to the designer, builder, and subcontractor
viewpoints.
(Figure 9.3 presents the custom matrix. The descriptive focus columns are what (data), how (function), where (network), who (people), when (time), and why (purpose); the stakeholder rows are the contextual planner's view (scope), the conceptual owner's view (enterprise model), the logical designer's view (system model), the physical builder's view (technology model), and the out-of-context contractor's view (detailed models). Populated cells include the CI value proposition (Section 9.3.1), the CI business process and project milestones (Section 9.3.2), business functions and the CI organization (Section 9.3.3), enterprise requirements and the spiral development schedule (Section 9.3.4), use cases and scenarios (Section 9.3.5), database schema and data I/O designs, analyst-computer interfaces, software build/buy component requirements, the staff plan, operational security policy and procedures, and the network and computer security designs.)
The CI value proposition must define the value of competitive intelligence, how
it will impact FaxTech, who will use the intelligence products, and how the
intelligence will be valued. The mission statement succinctly defines the CI
mission:
The competitive intelligence unit provides FaxTech executive management
timely and accurate assessments of competitors' activities and market positions, using ethical means, to permit an assessment of FaxTech's relative
The value added by the CI unit must be explicitly defined and top-level
performance goals identified for management by further decomposing the mission statement into the elements of value using the balanced scorecard. The
scorecard method, introduced in Chapter 3, Section 3.5, identifies specific
measures in each of the areas of the scorecard. This value proposition for CI
must include three components: the delivery of reliable and relevant information, interpretation of findings to impact strategy, and creation of special insight
to support critical decisions [15]. Competitive intelligence measures include
both hard quantitative metrics of intelligence timeliness and accuracy and soft
subjective measures of the unit's organizational performance [16].
The quantitative measures may be difficult to define; the financial return
on CI investment measure, for example, requires a careful consideration of how
the derived intelligence couples with strategy and impacts revenue gains. Kilmetz and Bridge define a top-level measure of CI return on investment (ROI) that considers the time frame of the payback period (t, usually updated quarterly and accumulated to measure the long-term return on strategic decisions) and applies the traditional ROI formula, which subtracts the cost of the CI investment (C_CI+I, the initial implementation cost plus accumulating quarterly operations costs, using net present values) from the revenue gain [17]:

ROI_CI = [(P × Q) - C_CI+I]_t
The expected revenue gain is estimated by the increase in sales (units sold,
Q, multiplied by price, P, in this case) that is attributable to CI-induced decisions. Of course, the difficulty in defining such quantities is the issue of assuring that the gains are uniquely attributable to decisions made possible only by CI
information [18].
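As a purely illustrative sketch (the quarterly figures are hypothetical and the net-present-value treatment is simplified), the ROI_CI calculation might be computed as follows:

def roi_ci(price, units_sold_from_ci, implementation_cost,
           quarterly_ops_costs, discount_rate_per_quarter=0.02):
    """Cumulative CI return on investment over the payback period t.

    The revenue gain is P x Q attributed to CI-induced decisions; the CI cost is
    the initial implementation cost plus quarterly operations costs discounted
    to net present value.
    """
    npv_ops = sum(cost / (1 + discount_rate_per_quarter) ** (q + 1)
                  for q, cost in enumerate(quarterly_ops_costs))
    revenue_gain = price * units_sold_from_ci
    return revenue_gain - (implementation_cost + npv_ops)

# Example: a $500 product, 400 unit sales attributed to CI over four quarters,
# a $50,000 implementation cost, and $20,000 operations cost per quarter.
print(f"ROI_CI = ${roi_ci(500, 400, 50_000, [20_000] * 4):,.0f}")

Accumulating this value quarter by quarter over the payback period approximates the long-term return on strategic decisions described above.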
In building the scorecard, the enterprise architect should seek the lessons learned from others, using sources such as the Society of Competitive Intelligence Professionals or the American Productivity and Quality Center [19].
These sources can also provide peers by which FaxTech may benchmark its
processes and expected values. The scorecard (Table 9.2) provides the highest
level definition of expected values and measures for the CI unit to guide its
operations and measure its value-added contribution to FaxTech management.
9.3.2
Table 9.2 defines the CI scorecard, pairing scorecard areas (shareholder/financial, customer/management, and learning and growth) with system measures and outcome measures such as ROI_CI, timeliness of alerts, reporting accuracy, and response time.
1. Planning and direction. The cycle begins with the specific identification of management needs for competitive intelligence. Management
defines the specific categories of competitors (companies, alliances)
and threats (new products or services, mergers, market shifts, technology discontinuities) for focus and the specific issues to be addressed.
The priorities of intelligence needed, routine reporting expectations,
and schedules for team reporting enable the CI unit manager to plan
specific tasks for analysts, establish collection and reporting schedules,
and direct day-to-day operations.
2. Published source collection. The collection of articles, reports, and financial data from open sources (Internet, news feeds, clipping services,
commercial content providers) includes both manual searches by analysts and active, automated searches by software agents that explore
(crawl) the networks and cue analysts to rank-ordered findings. This
collection provides broad, background knowledge of CI targets; the
results of these searches provide cues to support deeper, more focused
primary source collection.
3. Primary source collection. The primary sources of deep competitor
information are humans with expert knowledge; the ethical collection
process includes the identification, contact, and interview of these
individuals. Such collections range from phone interviews, formal
meetings, and consulting assignments to brief discussions with competitor sales representatives at trade shows. The results of all primary
collections are recorded on standard format reports (date, source,
qualifications, response to task requirement, results, further sources
suggested, references learned) for subsequent analysis.
4. Analysis and production. Once indexed and organized, the corpus of
data is analyzed to answer the questions posed by the initial tasks. Collected information is placed in a framework that includes organizational, financial, and product-service models that allow analysts to
estimate the performance and operations of the competitor and predict
likely strategies and planned activities. This process relies on a synoptic
view of the organized information, experience, and judgment. SMEs
may be called in from within FaxTech or from the outside (consultants) to support the analysis of data and synthesis of models.
5. Reporting. Once approved by the CI unit manager, these quantitative
models and more qualitative estimative judgments of competitor
strategies are published for presentation in a secure portal or for formal presentation to management. As a result of this reporting, management provides further refining direction and the cycle repeats (a skeletal sketch of the cycle follows this list).
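The following skeletal Python sketch (illustrative only; the stage names simply mirror the five steps above) renders the cycle as a repeating workflow in which reporting feeds back into planning:

from enum import Enum

class Stage(Enum):
    PLAN_AND_DIRECT = 1
    COLLECT_PUBLISHED_SOURCES = 2
    COLLECT_PRIMARY_SOURCES = 3
    ANALYZE_AND_PRODUCE = 4
    REPORT = 5

def next_stage(stage):
    """Advance through the CI cycle; reporting cycles back to planning."""
    order = list(Stage)
    return order[(order.index(stage) + 1) % len(order)]

# Walk one full cycle, returning to planning after the report is delivered.
stage = Stage.PLAN_AND_DIRECT
for _ in range(len(Stage)):
    print(stage.name)
    stage = next_stage(stage)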
This high-level description can be further detailed in a high-level operational concept graphic (the DoD OV-1 product), which illustrates the component parts that accomplish the CI cell operations (Figure 9.4). This simple
model distinguishes the interfaces to management (at the top), the delivery of
intelligence products via a portal (on the right), and the sources of data (on the left), all organized around a knowledge base in the center. The knowledge base is partitioned into three sections. The requirements section holds specific needs and
the status of accumulating knowledge against those needs, the holdings section
accumulates collected materials, and the production section maintains produced
models and completed reports.
The annotated flows in the graphic show the major flow paths that lead
from planning and direction to collection, analysis, and reporting. This high-level operational graphic (DoD OV-2 format) provides a general description of
the flows of data without distinguishing the controlling activities that translate
the intelligence cycle into an efficient workflow.
9.3.3
The owner's view of the business process describes the CI process in greater
detail, distinguishing the flow of data between the basic functions of the cycle,
controls to those functions, and the mechanisms (means) that provide resources.
Of course, this cyclic model is the most abstract representation of the overall CI
process; in reality, the CI process is an intelligence continuum of these processes,
(Figure 9.4 sketches the CI cell: the leadership interface at the top carries requirements, tasking, status, formal reports, alerts and updates, informal reviews, and Thursday lunches; paid sources such as LexisNexis, trade journals, and the Internet, together with special collections from contacts, visits, trade shows, and consultants, feed collector reports and e-mails, clippings, and abstracted paper literature into the knowledge base; and the CI team, led by the CKO, analyzes the holdings and publishes full and abstracted, metadata-tagged reports, current CI news, competitor analyses, and operation cost and price models through the portal.)
(The IDEF0 flows show the five functions: plan and direct, collect from published sources, collect from primary sources, analyze and produce, and report. They are controlled by the CI requirements, ethics policy, collection and analytic procedures, budget and schedule, portal policy, and reporting templates, and are supported by mechanisms that include the CI budget, analytic resources, available sources, and the consultant budget. The analyze-and-produce function decomposes into indexing, filtering, and fusing data; maintaining the knowledge base and knowledge map (CI ontology); analyzing and synthesizing; and modeling and simulating, with retasking feedback to collection.)
Figure 9.5 (a) CI business process functional flow, and (b) analyze and produce process
functional flow.
The FaxTech CI unit must monitor a one-billion-dollar market with five major
competitors and over a dozen related market players that can have a significant
influence on the marketplace. This scale of study influences the intelligence
breadth and volume to be studied, the size, structure, and composition of the CI
organization (operational architecture), and the supporting system architecture.
Of course, the complexity and dynamics of FaxTech's rapidly changing business
also influence the rate and volume of raw data to be analyzed and the depth of
analysis required (e.g., financial simulations). In turn, these factors define the
required budget to support the enterprise installation and annual support.
The FaxTech CI unit (Figure 9.6) includes a team of five full-time individuals led by a senior manager with intelligence analysis experience. This manager accepts tasking from executive management, issues detailed tasks to the
analytic team, and then reviews and approves results before release to management. The manager also manages the budget, secures consultants for collection
or analysis support, manages special collections, and coordinates team training
and special briefings by SMEs. The deputy and two competitor analysts perform
the day-to-day term analysis of the competitor set and special analyses either
(Figure 9.6 shows the CI unit organization: the CI unit leader reports to the FaxTech CIO, who sets the vision, strategy, budget, and milestones; the leader interfaces to the CIO, approves products, serves as the competitive status expert, coordinates consultants, and leads analysis and production. A CI deputy/lead analyst manages the workflow; two competitor analysts handle special collection assignments; a knowledge-base manager maintains the knowledge map, database, and hardcopy libraries and orders open source services; and half-time domain expert and network support staff provide subject-matter expertise, technical standards, and network engineering, management, maintenance, security, and backup.)
For each of the five processes, a number of use cases may be developed to
describe specific actions that actors (CI team members or system components)
perform to complete the process. In object-oriented design processes, the development of such use cases drives the design process by first describing the many
ways in which actors interact to perform the business process [22]. A scenario or
process thread provides a view of one completed sequence through a single or
numerous use case(s) to complete an enterprise task. A typical crisis response
scenario is summarized in Table 9.3 to illustrate the sequence of interactions
between the actors (management, CI manager, deputy, knowledge-base manager and analysts, system, portal, and sources) to complete a quick response
thread. The scenario can be further modeled by an activity diagram [23] that
models the behavior between objects.
The development of the operational scenario also raises nonfunctional performance issues that are identified and defined, generally in parametric terms,
for example (a parametric sketch follows this list):
Rate and volume of data ingested daily;
Total storage capacity of the on-line and off-line archived holdings;
Access time for on-line and off-line holdings;
Number of concurrent analysts, searches, and portal users;
Information assurance requirements (access, confidentiality, and attack
rejection).
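As an illustration only (the parameter names and example values are hypothetical), such nonfunctional requirements can be captured in a simple parametric structure:

from dataclasses import dataclass

@dataclass
class NonfunctionalRequirements:
    daily_ingest_gb: float         # rate and volume of data ingested daily
    online_storage_tb: float       # capacity of on-line holdings
    offline_storage_tb: float      # capacity of off-line archived holdings
    online_access_seconds: float   # access time for on-line holdings
    offline_access_hours: float    # access time for off-line holdings
    concurrent_users: int          # concurrent analysts, searches, and portal users
    assurance_profile: str         # access, confidentiality, and attack rejection

ci_requirements = NonfunctionalRequirements(
    daily_ingest_gb=5.0,
    online_storage_tb=2.0,
    offline_storage_tb=10.0,
    online_access_seconds=3.0,
    offline_access_hours=4.0,
    concurrent_users=6,
    assurance_profile="enclave: firewall, intrusion detection, PKI",
)
print(ci_requirements)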
Table 9.3
A Crisis Response Scenario
1. Scenario: Crisis response
2. Summary: The CI team responds to an urgent request from management for a 72-hour turnaround
to analyze the potential for a merger between two competitors.
3. Preconditions: Competitors and consultants are identified in the current knowledge base.
4. Postconditions (results): Status reported daily; final report delivered at 72 hours.
5. Basic flow
1. Management issues a CI crisis response memo to the CI manager, identifying specific intelligence
needs, priorities, and response time.
2. CI manager enters the requirement template and translates the memo to specific direction:
2.1 CI manager identifies the target competitors of interest.
2.2 CI manager checks specific key information needs on requirements template.
2.3 CI manager prioritizes the needs.
2.4 CI manager establishes security requirements, schedule, and reporting-dissemination
requirements.
2.5 CI manager approves requirement.
3. CI deputy creates tasking plan to accomplish the requirement direction.
3.1 Deputy opens tasking template, plans activities, and allocates task assignments to team
members.
3.2 System issues tasking assignments to the collection and analysis team.
3.3 Deputy creates special collect request form; system issues request for immediate consultant
services.
4. Knowledge-base manager sets published data collection parameters.
4.1 Knowledge-base manager translates CI requirements to sources, search, and reply format
parameters.
4.2 Knowledge-base manager sets source, search, and format parameters and issues search orders
to commercial content providers.
4.3 Knowledge-base manager creates crisis holdings partition in knowledge base.
4.4 System searches sources and identifies existing relevant holdings and creates abstracts and
links within partition.
4.5 System populates partition with accumulating data, creates abstracts, and ranks data against
requirements.
4.6 System maintains crisis holdings summary metrics.
5. Consultant issues primary source reports.
5.1 Consultant reviews special collect requirements from the field via the portal.
9.3.6 CI System Abstraction
The purpose of use cases and narrative scenarios is to capture enterprise behavior
and then to identify the classes of object-oriented design. The italicized text in
the scenario identifies the actors, and the remaining nouns are candidates for
objects (instantiated software classes). From these use cases, software designers
can identify the objects of design, their attributes, and interactions. Based upon
the use cases, object-oriented design proceeds to develop sequence diagrams that
model messages passing between objects, state diagrams that model the dynamic
behavior within each object, and object diagrams that model the static description of objects. The object encapsulates state attributes and provides services to
manipulate the internal attributes.
Based on the scenario of the last section, the enterprise designer defines the
class diagram (Figure 9.7) that relates objects that accept the input CI requirements through the entire CI process to a summary of finished intelligence. This
diagram does not include all objects; the objects presented illustrate those that
acquire data related to specific competitors, and these objects are only a subset of
the classes required to meet the full enterprise requirements defined earlier. (The
objects in this figure are included in the analysis package described in the next section.)
The requirement object accepts new CI requirements for a defined competitor;
requirements are specified in terms of essential elements of information (EEI),
financial data, SWOT characteristics, and organization structure. In this object,
key intelligence topics may be selected from predefined templates to specify the intelligence requirements for a competitor or for a marketplace event [24].
The analyst translates the requirements to tasks in the task object; the task object
generates search and collect objects that specify the terms for automated search
and human collection from primary sources, respectively. The results of these
activities generate data objects that organize and present accumulated evidence
that is related to the corresponding search and collect objects.
The analyst reviews the acquired data, creating text reports and completing analysis templates (SWOT, EEI, financial) in the analysis object. Analysis
entries are linked to the appropriate competitor in the competitor list and to the
supporting evidence in data objects. As results are accumulated in the templates,
the status (e.g., percentage of required information in template completed) is
computed and reported by the status object. Summary of current intelligence
and status are rolled up in the summary object, which may be used to drive the
CI portal.
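A minimal sketch of how a few of these objects might be rendered in code follows (illustrative only; the attribute names echo those shown in the class diagram, while the types, defaults, and the helper function are assumptions):

from dataclasses import dataclass, field

@dataclass
class Requirement:
    competitor_id: str
    rqmt_text: str
    rqmt_priority: int
    eei_rqmts: list = field(default_factory=list)      # essential elements of information
    swot_rqmts: list = field(default_factory=list)
    financial_rqmts: list = field(default_factory=list)

@dataclass
class Task:
    task_id: str
    competitor_id: str
    task_text: str
    due_date: str
    task_priority: int
    assign_team: list = field(default_factory=list)

@dataclass
class Search:
    request_id: str
    competitor_id: str
    topic_keywords: list = field(default_factory=list)

def generate_task(req, task_id, due_date):
    """The analyst translates a requirement into a task (requirement generates task)."""
    return Task(task_id, req.competitor_id, req.rqmt_text, due_date, req.rqmt_priority)

req = Requirement("CMP-01", "Assess merger potential", 1, eei_rqmts=["financing", "leadership intent"])
task = generate_task(req, "T-100", "72 hours")
search = Search("R-001", task.competitor_id, topic_keywords=["merger", "acquisition"])
print(task.task_text, "->", search.topic_keywords)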
9.3.7
The abstractions that describe functions and data form the basis for partitioning
packages of software services and the system hardware configuration. The system architecture description includes a network hardware view (Figure 9.8, top)
and a comparable view of the packaged software objects (Figure 9.8, bottom).
The figure illustrates the allocation of packages to servers on the network.
The network view is organized in an n-tier architecture (n = 4), partitioning security, data storage, CI business logic, and client workstation functions.
The CI unit operates as a protected enclave within the larger FaxTech organization; the security zone includes the interface to the Internet, dedicated lines, and
the FaxTech Intranet. A firewall and a separate intrusion detection unit separate
the three tiers of the enclave from the Internet and the local intranet, providing
enclave boundary control. Not shown are other security services (e.g., public key
infrastructure and encryption services) to provide:
(The data tier holds the knowledge base server and OCR of paper documents; the business tier holds the search and extraction, analysis, portal, and mail-collaboration servers; the client tier serves the analytic staff; and the security, knowledge base, analysis, portal, and client packages are allocated across these servers within the CI enclave.)
Figure 9.8 System network architecture (top) and package interfaces (bottom).
packages, the knowledge base, and client objects. These middle-tier business
objects are implemented by a combination of commercial applications and custom software components to provide portal, mail and collaboration, search, and
special analytic services. The client tier includes the package of common objects
(e.g., browser, mail, collaboration, and office suite) at each analyst's workstation. The package view (Figure 9.8, bottom) describes the partitioning of the
objects into packages and their implementation on hardware servers.
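As an illustrative sketch (the allocation below is inferred from the figure and surrounding text and is not a prescribed configuration), the four-tier allocation of packages to servers can be expressed as a simple mapping:

# Hypothetical n-tier (n = 4) allocation of software packages to servers.
ENCLAVE_TIERS = {
    "security": {
        "firewall": ["boundary control"],
        "intrusion detection unit": ["attack detection"],
    },
    "data": {
        "knowledge base server": ["knowledge base", "OCR of paper documents"],
    },
    "business logic": {
        "search and extraction server": ["search"],
        "analysis server": ["analysis"],
        "portal server": ["portal", "mail and collaboration"],
    },
    "client": {
        "analyst workstation": ["browser", "mail", "collaboration", "office suite"],
    },
}

for tier, servers in ENCLAVE_TIERS.items():
    for server, packages in servers.items():
        print(f"{tier:14s} | {server:28s} | {', '.join(packages)}")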
The enterprise technical architecture is described by the standards for
commercial and custom software packages (e.g., the commercial and developed
software components with versions, as illustrated in Table 9.4) to meet the
requirements developed in the system model row of the matrix. Fuld & Company
has published periodic reviews of software tools to support the CI process; these
reviews provide a helpful evaluation of available commercial packages to support
the CI enterprise [25]. The technical architecture is also described by the standards imposed on the implementing components, both software and hardware. These standards include general implementation standards [e.g.,
American National Standards Institute (ANSI), International Standards
Organization (ISO), and Institute of Electrical and Electronics Engineers
(IEEE)] and federal standards regulating workplace environments and protocols. The applicable standards are listed to identify applicability to various functions within the enterprise.
A technology roadmap should also be developed to project future
transitions as new components are scheduled to be integrated and old
components are retired. It is particularly important to plan for integration of
new software releases and products to assure sustained functionality and
compatibility across the enterprise.
9.4 Summary
Architecture frameworks provide a structured modeling methodology to guide
the design of a complete enterprise, taking into account the operations, systems, and technologies necessary to meet the enterprise mission and deliver on
the value proposition. The simple example in this chapter has illustrated the
approach to develop a comprehensive set of descriptive products for even a small
competitive intelligence enterprise. Larger scale enterprises and those with existing legacy systems (like the U.S. IC and large business KM enterprises) require
significant rigor in applying these methodologies to comprehensively model
their enterprises to achieve increased intelligence value, but the principles illustrated in this example cell are applicable across the larger enterprise.
The extension of KM principles from the CI project across the entire FaxTech organization requires one last component: a means to communicate
Table 9.4 lists representative technical architecture software components, including security; the knowledge base (relational database); CI process definition, workflow audit, and status reporting; XML template creation and configuration management; metadata tagging and manual summarization; Web crawlers and search agents with keyword and contextual text search and summarization; image, video, and audio search; multilingual search and text translation; analysis; the portal; and client components (browser, mail and collaboration clients, office tool suite, and analytic tool clients).
Everyone was very open with me and shared their frustrations; they also shared
their lists of customers and showed me our competitors' products. I organized
that data in our warehouse and created a simple portal to share the data with our
marketing and sales teams. Within 2 months, I was surprised to see 65 employees conversing on the portal about this forgotten market and our competitors'
products. I organized their suggestions in a lessons-learned base. The sales force
logged into the portal and gained new insight and the marketing team developed a new TechAlert news product tailored to research and development labs.
Now, a year later, we have over 100 research and development lab accounts. The
sales team even included me on their team to receive the industry new product
award, and I didn't even know what a research and development lab was when I started here a year ago.
Jenny Crenshaw, Competitive Analyst, Corporate CI Unit
Endnotes
[1] The definition of architecture by the International Council on Systems Engineering System Architecture Working Group is: "The fundamental and unifying system structure defined in terms of system elements, interfaces, processes, constraints and behaviors." The more compact IEEE 610.2 definition is, "The organizational structure of a system or component."
[2] Martin, F. T., Top Secret Intranet: How U.S. Intelligence Built Intelink, the World's Largest, Most Secure Network, New York: Prentice Hall, 1998.
[3] Swindle, M., Intelligence Community System for Information Sharing (ICSIS), presented at DoD Secure E-Business Summit, May 7, 2001, accessed on-line on June 20, 2002 at http://www.secure-biz.net/OSD2001/presentations/Presentations/T06_Swindle.ppt. See also Onley, D. S., Intelligence Community Welcomes Data Sharing, Government Computer News, May 8, 2001, accessed on-line on June 28, 2002 at http://www.gcn.com/vol1_no1/daily-updates/4217-1.html.
[4] After the Sept. 11, 2001, terrorist attacks in the United States, numerous media reports focused on the capabilities of the IC to collaborate and share data. See, for example, Myers, L., Report Rips Intelligence Priorities, NBC and MSNBC News Service, July 17, 2002, accessed on-line on August 31, 2001, at http://www.msnbc.com/news/781308.asp; and Waller, D., Is the Intelligence Community Ready for this War? Time, October 9, 2001, accessed on-line on September 30, 2002, at http://www.time.com/time/columnist/waller/article/0,9565,178770,00.html.
[5] Verton, D., DIA Tackles Flow of Intelligence, Federal Computer Week, October 18, 1999.
[6] See, for example, the following classic articles: Zachman, J. A., A Framework for Information Systems Architecture, IBM Systems Journal, Vol. 26, No. 3, 1987, and Sowa, J. F., and J. A. Zachman, Extending and Formalizing the Framework for Information Systems Architecture, IBM Systems Journal, Vol. 31, No. 3, 1992.
[7] C4ISR Architecture Framework Version 2.0. U.S. DoD Office of the Assistant Secretary of
Defense for Command, Control, Communications, and Intelligence, Washington, DC,
November 1997. (The DoD is reviewing a version 2.1 at the time of this manuscript and
is considering revising the name to DoD Architecture Framework, or DoDAF.)
[8] Federal Enterprise Architecture Framework, The Chief Information Officers Council,
Version 1.1, September 1999.
[9] Sowell, P. K., The C4ISR Architecture Framework: History, Status, and Plans for Evolution, Proc. of 5th International Command and Control Research and Technology Symposium,
June 2000; see also Sowell, P. K., Mapping the Zachman Framework to the C4ISR
Architecture Framework, MITRE, September 3, 1999.
[10] C4ISR Framework Ver 2.0, Table of Essential and Supporting Framework Products.
[11] See the author's Requirements Derivation for Data Fusion Systems, in Handbook of Multisensor Data Fusion, Hall, D. L., and J. Llinas, Boca Raton: CRC Press, 2001, pp. 15-1–15-8.
[12] See, for example, Rechtin, E., System Architecting: Creating and Building Complex Systems, Englewood Cliffs, NJ: Prentice Hall, 1991; Rechtin, E., and M. Maier, The Art of System Architecting, Boca Raton: CRC Press, 1997; Maier, M. W., Architecting Principles for Systems-of-Systems, Systems Engineering, Vol. 2, No. 1, 1999, pp. 1–18; and Rechtin, E., The Art of Systems Architecting, IEEE Spectrum, October 1992, pp. 66–69.
[13] The name FaxTech has no relationship to any actual business; it has been created solely for
this example. The name has been checked against the U.S. patent and trademark base to
ensure that it is not a registered trademark.
[14] Adapted from the KM project checklist in: O'Dell, C., and C. J. Grayson, If Only We Knew What We Know, New York: Free Press, 1998, pp. 194–195.
[15] Langabeer II, J. R., Exploring the CI Value Equation, Competitive Intelligence Review,
Vol. 10, No. 3, 1999, pp. 27–32.
[16] Simon, N. J., Determining Measures of Success, Competitive Intelligence, Vol. 1, No. 2,
July–September 1998, pp. 45–47.
[17] Kilmetz, S. D., and R. S. Bridge, Gauging the Returns on Investments in Competitive
Intelligence: A Three-Step Analysis for Executive Decision Makers, Competitive Intelligence Review, Vol. 10, No. 1, 1999, pp. 4–11.
[18] Davison, L., Measuring Competitive Intelligence Effectiveness: Insights from the Advertising Industry, Competitive Intelligence Review, Vol. 12, No. 4, 2001, pp. 25–38.
[19] The Society of Competitive Intelligence Professionals Web site is http://www.scip.org.
The American Productivity and Quality Center Web site is http://www.apqc.org.
[20] Integrated Definition for Function Modeling (IDEF0), Federal Information Processing
Standards Publication 183, NIST, December 21, 1993. This is but one of a number of
modeling standards that may be adopted.
[21] The IDEF1X companion standard to IDEF0 provides a data modeling standard for
describing relational data: Integrated Definition for Information Modeling (IDEF1X),
Federal Information Processing Standards Publication 184, NIST, December 21, 1993.
[22] Rosenberg, D., Use Case Driven Object Modeling with UML, Boston: Addison-Wesley,
1999.
[23] The diagrams referenced in this section are standard unified modeling language constructs. See the Object Management Group for latest UML standards, http://www.
omg.org.
[24] See an enumeration of typical key intelligence topics in Herring, J. P., Key Intelligence
Topics: A Process to Identify and Define Intelligence Needs, Competitive Intelligence
Review, Vol. 10, No. 2, 1999, pp. 4–14.
[25] Intelligence Software Report 2002, Fuld & Co., Cambridge, MA, 2002. This report, first
issued in 2000 and updated in 2002, identified over 40 applicable packages, then reviewed
and scored the applicability of over 10 dedicated CI packages in detail.
10
Knowledge Management Technologies
IT has enabled the growth of organizational KM in business and government; it
will continue to be the predominant influence on the progress in creating
knowledge and foreknowledge within intelligence organizations. As we have
pointed out throughout this book, the successful application of new IT to
improve organizational effectiveness is entirely dependent upon the success of
process and cultural transitions necessary to reap the benefits. Envisioning the
future of KM requires an understanding of the trends in IT, particularly in those
technologies that we distinguish here as KM technologies, which will continue
to expand human thinking, collaboration, and problem solving. As we noted in
earlier chapters, the future intelligence competitions will be for rapid knowledge
discovery and delivery. When sensors and communications become global commodities, the critical technologies for the intelligence organization will be those
that support deep and rapid analysis-synthesis. In this chapter, the key enabling
and emerging KM technologies that will enable future generations of intelligence enterprises are introduced.
10.1 Role of IT in KM
When we refer to technology, the application of science by the use of engineering
principles to solve a practical problem, it is essential that we distinguish between three categories of technologies that all contribute to our ability
to create and disseminate knowledge (Table 10.1). We may view these as three
technology layers, with the basic computing materials sciences providing the
foundation for technology applications of increasing complexity and scale in communications and computing. These materials technologies produce computing
(Table 10.1 lists each technology category with a description and technology examples, such as object-relational databases.)
(A companion exhibit traces computing eras from mainframe to PC, client-server, and networked computing: the organizing form and control move from centralized hierarchy to distributed and then to networked, self-organizing structures; computing structure evolves from mainframe homogeneity through personal computing and client-server to n-tier networked heterogeneity; the software organizing principle moves from structured to object-oriented to component services; language generations advance from 1GL (Fortran, Cobol) and 2GL (C, Basic, Pascal) to 3GL (C++) and 4GL (platform-independent Java); the unit of function granularity grows from batch subroutines and run-time modules to session objects and dynamic, agent-organized components and services; system control passes from a single administrator and single user to multiple users organized by collaborative work tools and finally to self-organization by automated agency; and the implications for knowledge creation progress from independent analysis with face-to-face and telephone collaboration, through user-to-user teleconference collaboration, to user-to-agent collaboration.)
context, and they will guide the organizing properties of the network to
optimize service delivery.
Security services will provide strong, biometric-based protection of
highly effective collaboration and problem solving among teams of humans and
their autonomous agents across secure networks with access to vast arrays of heterogeneous data stores of structured and unstructured data. These agents will
empower analysts to query directly for knowledge (i.e., answers, rather than
querying for data or information) across multiple languages and media. They
will also help analysts understand the results of queries, representing and explaining multidimensional information for human understanding. This
human-agent collaboration will, however, introduce new cultural and social
issues yet to be appreciated.
[Figure: KM research and technology developers across the U.S. national security community.
Joint Services: JCS (information superiority and network-centric warfare leadership); OSD/C3I (architectures, C4ISR systems); DISA/D2 (C4I and intelligence); D7 (requirements analysis); National Defense University (NCW and IW academic theory); DARPA (information technology and systems (KM) research); DMSO (modeling and simulation).
Air Force: AF Office of Scientific Research (artificial intelligence, perception, and cognition); AF Science Advisory Board (battlespace infosphere); AF Research Lab, Information Directorate (Rome) and Sensor Directorate (Dayton) (C4ISR); AF Battle Labs (information operations experiments).
Army: Army Research Office (knowledge-based systems, sensor webs); Army Research Lab (battlefield KM, nets, and decision making); Army Digitization Office (experiment and demonstration integration); Army Space Program Office.
Navy/Marines: Office of Naval Research (cognitive augmentation, training, intelligent systems); SPAWAR System Center San Diego (NCW systems R&D); Naval Research Lab, Navy Center for AI Research (KM and intelligence research); NWDC (NCW experimentation).
Intelligence Community: ARDA (Advanced R&D Agency); In-Q-Tel (intelligence community); CIA/DIA (all-source KM, analysis, and digital production R&D); NRO, NSA, NIMA, CMO (tasking, processing, exploitation, and dissemination R&D).]
[Table: KM technology roadmap contrasting first-generation KM (supply-side emphasis: e-mail, lessons-learned culture), second-generation KM (demand-side emphasis: multimedia metadata query, text-numerical analytic tools, portals, collaboration workspaces, Web-based culture), and future KM (multilingual natural language query, multimedia ontologies and markup languages, networked human-computer team culture).]
The second generation of KM enterprise technology now being implemented emphasizes the demand side by focusing on the delivery of products to users. To deliver accurate, timely, and relevant knowledge to users effectively,
technologies are being developed to support four demand-side goals:
1. Complete use of all data types. IT is being applied to capture, store,
search, and manipulate multimedia information (multilingual audio,
video, and text; imagery and geospatial data). Object-relational databases and warehouses store structured and unstructured data; metadata
tagging (e.g., XML) is used to increase the content description of data
sets, while content-analysis technology is increasing the ability to search, categorize, and tag unstructured data (a brief tagging sketch follows this list).
2. Organization-wide collaboration. Integrated synchronous-asynchronous collaboration tools enable secure multiple-mode text, graphics, audio-video conferencing, and data sharing.
3. Comprehensive analysis of all sources. Data-mining and data-fusion tools
process a wide variety of numerical and textual data sets to correlate
content, detect patterns of known behavior, and discover relationships
across diverse data types.
4. Organization-wide tailored dissemination of knowledge. Corporate portals and information-assurance technologies are providing secure access to knowledge across the organization. User profiling (with profiles entered manually by the user and learned automatically from user behavior) tailors the information presented to each user.
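To make the metadata tagging in goal 1 concrete, the following minimal sketch (illustrative only; the element names and report fields are hypothetical and do not represent a standard adopted by any intelligence organization) uses Python's standard library to wrap a raw report in descriptive XML so that a warehouse or portal can index and search its content description:

import xml.etree.ElementTree as ET

def tag_report(report_id, language, media_type, region, body_text):
    # Wrap a raw report in descriptive XML metadata (hypothetical schema).
    report = ET.Element("report", attrib={"id": report_id})
    meta = ET.SubElement(report, "metadata")
    ET.SubElement(meta, "language").text = language      # e.g., "en", "ar"
    ET.SubElement(meta, "mediaType").text = media_type   # e.g., "text", "audio"
    ET.SubElement(meta, "region").text = region          # content descriptor
    ET.SubElement(report, "content").text = body_text
    return ET.tostring(report, encoding="unicode")

# Example: a tagged record ready for indexing alongside structured holdings.
print(tag_report("R-001", "en", "text", "example-region", "Raw report text ..."))

The same record could then be categorized by content-analysis tools and retrieved through the organization's portal.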
The effect of these current second-generation technologies has been to enable a Web-based culture that supports larger, distributed communities of practice, composed of people who collaborate in knowledge sharing and problem solving.
Future generations of KM will emphasize the creation of knowledge by completely integrating the supply and demand sides of the enterprise, with greater machine intelligence supporting its human users. These future technologies will build on previous generations by adding:
Deep content understanding. Ontologies of general (common-sense) and
Future explicit knowledge combination technologies include those that transform explicit knowledge into usable forms and those that perform combination processes to create new knowledge.
Multimedia content-context tagged knowledge bases. Knowledge-base
[Table: key KM technologies organized by knowledge conversion mode (explicit-explicit, Section 10.4.1; explicit-tacit, Section 10.4.2; tacit-tacit, Section 10.4.3) and by maturity.
Core (required to maintain and sustain current capabilities): push/pull (subscribe/query) dissemination; distance e-learning; IIDR; 3D and synthetic n-dimensional visualization (virtual reality); semiautomated data fusion and mining of structured data; synchronous/asynchronous collaboration tools.
Enabling (technology base for next-generation capabilities): creativity supportive tools; visual and strategy simulation tools (experience); tailored naturalistic collaboration tools; situation immersion; integrated deduction-induction; intimate tacit simulations; multilingual speech interaction.
Emerging (beyond next generation, a departure from current practices): automated deductive-inductive reasoning and learning; automated ontology creation; purposeful, aware agents; human cognition augmentation; direct brain interaction; pervasive personal networked computers; combined human-agent learning (personal agent tutors, mentors, and models); indistinguishable human-like agent partners, communities of practice, and teams; direct brain tacit knowledge awareness, tracking, articulation, and capture.]
unstructured) with tagging of both content and context to allow comprehensive searches for knowledge across heterogeneous sources.
Multilingual natural language. Global natural language technologies
will also allow linkage of concepts across cultures and continents. The
technology will also allow context and content summarization, understanding, and explanation [5].
Integrated deductive-inductive reasoning. Data-fusion and data-mining processes will coordinate inductive (learning and generalization) and deductive (decision and detection) reasoning, as well as abductive explanatory reasoning, across unstructured multilingual natural language, common-sense, and structured knowledge bases. This reasoning will be goal-directed, based upon agent awareness of purpose, values, goals, and beliefs (a toy sketch of such coordinated reasoning follows this list of technologies).
Automated ontology creation. Agent-based intelligence will learn the
laborative team members, first as supporting helpers, then as intelligence participants and peers who can become expert contributors. The
agents will operate with autonomy, purpose, intelligence, and the capability to explain the rationale for their contributions to the team.
Rapid expert knowledge acquisition. Distributed, continually updating,
major global languages will not only enhance the capture of explicit
conversation, it will enable more natural and accurate collaboration
between analysts and foreign intelligence partners and contributors.
Purposeful, aware agents. Agent technology will provide a degree of self-
attention and focus will be used to track performance and guide the flow of information to augment cognition; these technologies will also counteract human limitations in memory, attention, learning, and sensing (visualization) [8]. Instinctual systems will detect human sense and autonomic reactions, as well as subliminal cognitive cues (e.g., the unspoken "huh?," "hmm," "oh!," and "ah ha!"), to cue tight interactive analysis collaboration between human and machine.
Direct brain interaction. This is direct multidimensional presentation to
(wireless) computations will enable their continuous presence to monitor, support, and anticipate human activity. Analysis, problem solving, and collaboration will not be constrained to the desktop.
vide environments with automated capabilities that will track the context of activities (speech, text, graphics) and manage the activity toward
defined goals. These environments will also recognize and adapt to
individual personality styles, tailoring the collaborative process (and the mix of agents and humans) to the diversity of the human team's composition.
Intimate tacit simulations. Simulation and game technologies will enable
models will shadow their human partners, share experiences and observations, and show what they are learning. These agents will learn to monitor subtle human cues about the capture and use of tacit knowledge in collaborative analytic processes [9].
Direct brain tacit knowledge. Direct brain biological-to-machine con-
nections will allow monitors to provide awareness, tracking, articulation, and capture of tacit experiences to augment human cognitive
performance.
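As a toy illustration of the integrated deductive-inductive reasoning described above, the following sketch (purely notional; the event fields, thresholds, and rule are invented for this example and represent no fielded fusion or mining system) pairs a hand-coded deductive detection rule with a simple inductively learned profile, and records the rationale for each alert so it can be explained to an analyst:

from statistics import mean, pstdev

def rule_based_alert(event):
    # Deductive step: a hand-coded decision rule (hypothetical doctrine).
    return event["amount"] > 10000 and event["region"] == "REGION-X"

def learn_profile(history):
    # Inductive step: generalize "normal" behavior from past observations.
    amounts = [e["amount"] for e in history]
    return {"mean": mean(amounts), "std": pstdev(amounts) or 1.0}

def anomaly_alert(event, profile, k=3.0):
    # Flag events far outside the learned profile (simple z-score test).
    return abs(event["amount"] - profile["mean"]) > k * profile["std"]

def assess(event, profile):
    # Coordinate both lines of reasoning and retain the rationale.
    reasons = []
    if rule_based_alert(event):
        reasons.append("deductive: matched detection rule")
    if anomaly_alert(event, profile):
        reasons.append("inductive: deviates from learned profile")
    return reasons

history = [{"amount": a} for a in (120, 90, 150, 110, 95)]
profile = learn_profile(history)
print(assess({"amount": 25000, "region": "REGION-X"}, profile))

Real systems would operate over far richer evidence and add abductive explanation, but the division of labor between learned generalization and rule-based decision is the same.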
10.5 Summary
KM technologies are built upon materials and information technologies that enable the complex social (organizational) and cognitive processes of collaborative knowledge creation and dissemination to occur across large organizations and massive scales of knowledge. Technologists, analysts, and developers of intelligence enterprises must monitor these fast-paced technology developments to continually reinvent the enterprise and remain ahead in the global competition for knowledge.
This continual reinvention process requires a wise application of technology in
three modes. The first mode is the direct adoption of technologies by upgrade
and integration of COTS and GOTS products. This process requires the continual monitoring of industry standards, technologies, and the marketplace to
project the lifecycle of products and forecast adoption transitions. The second
application mode is adaptation, in which a commercial product component may
be adapted for use by wrapping, modifying, and integrating with commercial or
custom components to achieve a desired capability. The final mode is custom
development of a technology unique to the intelligence application. Often, such
technologies may be classified to protect the unique investment in, the capability of, and in some cases even the existence of the technology.
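As a minimal sketch of the adaptation mode, the wrapper below (the vendor class and its method names are hypothetical placeholders, not a real product interface) exposes only the capability the enterprise needs and keeps custom components insulated from the commercial component's native interface:

class VendorSearchEngine:
    # Stand-in for a commercial (COTS) component with its own interface.
    def run_query(self, expression, max_hits):
        return [f"hit for '{expression}'"][:max_hits]

class EnterpriseSearchService:
    # Wrapper that adapts the COTS interface to the enterprise's own contract.
    def __init__(self, engine):
        self._engine = engine

    def search(self, topic, limit=10):
        # Adaptation point: access control, audit logging, or result tagging
        # can be added here without modifying the commercial component.
        return self._engine.run_query(expression=topic, max_hits=limit)

service = EnterpriseSearchService(VendorSearchEngine())
print(service.search("key intelligence topic", limit=5))

When the underlying product is upgraded or replaced, only the wrapper changes; the rest of the enterprise continues to call the same service.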
Intelligence organizations have a mandate to remain competitive and a
mandate to leverage the significant commercial investment in information and
KM technologies. The Government Electronics and Information Technology Association reported that in 1997 global commercial IT research investments exceeded the entire (not just IT-related) U.S. DoD research, development, test, and evaluation budget and were increasing at a rate significantly higher than DoD investments [10]. This commercial technology is available to all intelligence competitors, and the intelligence technologist requires the wisdom to know which technologies to adopt, adapt, and develop, and how. Technology is enabling, but it is not sufficient; intelligence organizations must also have the vision to apply these technologies while transforming the intelligence business in a rapidly changing world.
Endnotes
[1]
The Global Technology Revolution Bio/Nano/Material Trends and Their Synergies with
Information Technology by 2015, Study for the National Intelligence Council, RAND
National Defense Research Institute, 2001. In addition, the projected global societal
effects of IT were reported in Anderson, R., et al., The Global Course of the Information
Revolution: Technological Trends: Proceedings of an International Conference, RAND
CF-157-NIC, 2000. In the terminology of that report, KM capabilities would be considered artifacts or services, rather than a technology.
[2] The Strategic Intent is a classified document, but essential KM activities are described in unclassified documents. See IC CIO, Advancing Information Systems to Improve the Business of Intelligence, DCI/CMS/IC CIO, 2000.
[3] For example, see Enterprise Portals: Connecting People, Information, and Applications,
META Group Publications, 2000; Advanced Technologies and Applications Study, The
Gartner Group, 2001; and Corporate Portals, The Delphi Group, 2001.
[4] The reader should be aware that this future possibility is envisioned by some serious thinkers as a grave danger. In the context of this book, we deal only with the future certainty of
increasing machine intelligence and the potential benefits to knowledge creation. For the
concerns on this issue, see Kurzweil, R., The Age of Spiritual Machines: When Computers Exceed Human Intelligence, New York: Penguin Books, 1999; and Joy, Bill, Why the Future Doesn't Need Us, Wired, 8.04, April 2000, accessed online on August 6, 2002, at
http://www.wired.com/wired/archive/8.04/joy.html.
[5] The DARPA Translingual Information Detection, Extraction, and Summarization Program is conducting research in relevant topic detection, extraction, and tracking to enable
English-speaking users to access, correlate, and interpret multilingual sources of real-time
information.
[6] The DARPA Evidence Extraction, Linking, and Discovery program is developing knowledge discovery technology to overcome the domain dependent limitations of current technologies to provide extraction of evidence from unstructured data, automated detection of
relevant links, and learning of a wide range of pattern types (e.g., temporal, organizational,
and transactional).
[7] The Advanced Research and Development Agency Advanced Question and Answer Program (Acquaint) is conducting research in question understanding and interpretation
(including contextual interpretation, query expansion, and query taxonomy), answer derivation (including information retrieval and extraction from multiple media/languages and
data types, interpretation, synthesis, resolving conflicting information, and justification),
and answer formulation and presentation (including summarization, synthesis, and
generation).
[8] The DARPA Augmented Cognition program is conducting research methods to measure
human cognitive load and capacity to optimize the flow of information to the human sensory channels in an effort to overcome cognitive limitations. The DARPA-sponsored InfoCockpit research conducted by Carnegie Mellon/University of Virginia is an example of an
immersive visual environment constructed to enhance human memory of complex situations.
[9] The ARDA Novel Intelligence from Massive Data program is researching tacit knowledge
capture and use technologies.
[10] Federal Information Technology Forecast, Government Electronics and Information
Technology Association, Washington, D.C., 1998.
Index
availability, 48
capability maturity levels of, 124
collaborative, 143, 173
collaborative tools, 44
data, 5
decision, 147
defined, 160, 16465
hypotheses, 226
impact refinement, 202
matrix, 225
MI objects of, 32
new, 49
situation refinement, 202
subject of, 209
user availability, 47
See also Analysis-synthesis
Analysis of competing hypotheses (ACH),
144, 22427
for collaborative analytic reasoning, 225
matrix, 255
process flow illustration, 226
structure, 225
Analysis-synthesis, 16162
collaborative, 235
distinctions, 163
intelligence consumer expectations and,
19698
with intelligence cycle, 16162
in intelligence workflow, 198202
as modeling process, 18086
Analysis-synthesis (continued)
M&S support, 290
in national decision-making chain, 165
practice of, 195235
principles of, 159
process, 16061
process description, 162
structured, stages, 22729
See also Intelligence analysis
Analyst-agent interaction, 26567
Analytic applications, 166
Analytic tools. See Cognitive services
Apprehension, 188
Archtypes, 79
Argumentation
chain of reasoning for, 214
common forms of, 21314
convergent, 213
defined, 210
divergent, 214
graph theory and, 217
linked, 213
serial, 213
space, 183
structured inferential, 21315
structuring, 20923
Toulmin's scheme, 215
Asynchronous collaboration, 133
Automated combination, 27489
data fusion, 27783
data mining, 28388
defined, 27475
Automated data fusion, 243
Automated ontology creation, 337
Automated processing, 2035
implementation, 205
problems, 2045
Backward chaining, 177
Balanced scorecard (BSC), 8992
cause-and-effect model, 9192
CI unit case study, 310
defined, 89
gains, 90
valuation model, 90
views, 8991
Black propaganda, 230
Brain
direct interaction, 338
direct tacit knowledge, 339
business operational concept, 312
business process, 30912
business process functional flow, 31214
business process functional flow
illustration, 314
crisis response scenario, 31718
enterprise architecture descriptive
products, 308
FaxTech, 31516, 320
mid-tier business objects, 322
operational scenario, 31618
organization and relationships, 31516
package interfaces, 321
return on investment (ROI), 309
situation, 3067
system abstraction, 31819
system class diagram, 320
system network architecture, 321
system/technical architectural
descriptions, 31922
technical architectural components, 323
value proposition, 3089
waterfall flow, 313
Civilization, waves of, 7
Codification and coordination, 71
Cognitive hierarchy, 3
Cognitive services, 245, 24956
decision support tools, 25253
defined, 249
exploration tools, 250
importance, 24950
reasoning tools, 25052
sensemaking tools, 252
summary, 25758
taxonomy, 251
tool use, 25356
Cognitive shortcomings, 2079
strategies, 209
types of, 208
vulnerabilities, 23132
Collaboration, 108
asynchronous, 133
barriers, 108
behavior levels, 130
centralized, 13637
directory, 253
distributed, 137
facilitation role in, 13133
facilitators, 133
human-agent, 335
modes, 134
mutual trust, 131
organizational, 12942
paths, 14042
process, 130
services, 245
social, 118
synchronous, 133
tailored naturalistic, 339
virtual, 118
Collaborative analysis tools, 44
Collaborative culture, 13133
Collaborative environments, 13337
defined, 133
group structuring, 13637
virtual, 133, 134
Collaborative information agents, 265
Collaborative intelligence workflow, 13742
collaborative paths, 14042
situation, 138
team, 13840
Collaborative processes, 109
Collaborative virtual environment (CVE),
133
Collection, 3334
categories, 36
HUMINT, 3738
intelligence, 3642
managers, 13940
planning, 4042, 161
primary source, 311
published source, 311
sources and methods, 3540
tasking, 161
technical intelligence, 3840
Collective intelligence, 129
Combination
automated, 27489
defined, 272
explicit knowledge capture and, 274
explicit to explicit, 72
intelligence use case spiral, 79
in intelligence workflow, 243
Commercial off the shelf (COTS), 256, 340
Communities of organizational practice,
11517
defined, 116
forum, 118
Decision synthesis, 22627
Deduction, 16768
defined, 167
expression, 167
in intelligence example, 178
path of, 17677
propositional argument forms, 169
See also Reasoning
Defense Intelligence Journal, 20
Denial, 229
counter analysis process components, 234
countering, 22934
countermeasures, 232
counter methodology components, 233
hypotheses, 233, 234
methods, 229
operations, 231
See also Deception
Descriptive analyses, 11
Digital production services, 245
Directed graphs, 217
evidence and inference forms, 218
inference network, 221
in linking evidence, 224
Direct evidence combination, 217
Disintermediation, 94
Dissonance evidence combination, 219
Distributed collaboration, 137
Document object model (DOM), 262
Domain knowledge, 277
Dynamic knowledge, 59
Economic value added (EVA), 88
Effects-based military operations (EBO), 279
Enterprise
architectural model, 97
defined, 1
information architecture elements, 19
intelligence, 2
Enterprise architecture, 85, 299324
defined, 299
describing, 3026
design case study, 30622
integration system, 247
operations, 300301
summary, 32224
Evidence
analysis, 225
marshaling, 176, 21113
missing, 174
negative, 175
positive, 174
Evidence combination, 21719
consonance, 21719
direct, 217
dissonance, 219
redundant, 219
Explanation space, 18384
Explicit capture, 274
Explicit knowledge, 6368
basis of, 64
categorization, 67
combination technologies, 33537
conversation modes, 7173
defined, 63
knowing, 68
in physical sciences, 64
understanding, 65
See also Knowledge
Exploration tools, 250
Externalization
intelligence use case spiral, 7879
in intelligence workflow, 24243
tacit to explicit, 7172
Externalization-to-internalization workflow,
256
First generation KM, 333
Formal learning, 12729
distance learning canned (DL-canned),
128
distance learning collaborative (DLcollaborative), 128
distance learning remote (DL-remote),
12728
residential learning (RL), 127
See also Learning
Forward chaining, 177
Generation, 7071
Global sensors, 8
Government off the shelf (GOTS), 256, 340
Gray propaganda, 230
Groupware
collaborative implementations, 13637
defined, 134
functions, 13536
Harvard Business Review, 22
Holism, 147
Inference processes
direction of, 167
structure of, 168
See also Reasoning
Inference(s)
computation, 224
reconstructive, 234
Inferential analyses, 1112
Inferential networks, 21523
logical structure, 223
for military deception hypothesis, 221
Infomediation, 94
Informal learning, 12527
experimentation, 12526
external comparison sources, 126
internal experience, 126
knowledge transfer, 12627
systematic problem solving, 125
See also Learning
Information
defined, 3
dominance, 16
enterprise architecture elements, 19
SBU, 75
See also Data
Information, indexing, discovery, and
retrieval (IIDR), 42
Information retrieval (IR), 24749
approaches, 24748
capabilities taxonomy, 249
data query methods, 247, 248
defined, 247
text query methods, 248
Information visualization, 26465
applications, 266
methods, 265
Innovation capital, 86
Institute of Electrical and Electronics
Engineers (IEEE), 322
Intangible asset monitor (IAM), 88, 89
Integrated deductive-inductive reasoning,
337
Integrated Definition Models (IDEF), 303
Integrated reasoning process, 17580
Intellectual capital
components of, 87
defined, 85
See also Capital
Intelligence
access categories, 36
applications, 1217
business (BI), 14, 17
as capital, 8593
categories, 1012
collective, 129
competitor (CI), 14, 17
cycle, 34
defined, 12, 1011, 29
descriptive analyses and, 11
disciplines, 6, 1217
discipline subjects, 14
dissemination, 35
environment, changing, 50
future of, 4850
global cooperation, 49
human (HUMINT), 6, 7, 3738, 4142
inferential analyses and, 1112
KM and, 124
measurements and signatures (MASINT),
37
methodology standards, 198
military (MI), 14, 16
national, 14, 1516
nation-state, 1213, 2933
objects, 13
open source (OSINT), 20, 35, 49
planning/direction, 33
processes, 3335
processing, 34
production, 3435
products, 11, 3335
threat-warning, 13
See also Analysis; Collection
Intelligence analysts, 164
cognitive shortcomings, addressing,
2079
role of, 2057
tacit knowledge, 205
Intelligence and National Security, 20
Intelligence business models, 9596
B2B, 95
B2C, 95
C2B, 96
C2C, 96
Intelligence business strategies, 9394
customer aggregation, 94
disintermediation, 94
infomediation, 94
Intelligence consumer
defined, 13839
expectations, 19698
as stakeholder, 30
See also Project team
Intelligence CRM, 99
Intelligence deception, 231
Intelligence enterprise, 1720, 2950
applications, 96102
architecture, 96102, 299324
characteristics, 9697
defined, 2
elements, 17
existence, 29
HCI view, 81
operational architecture, 18
operational view, 81
operations, 300301
systems architecture, 18
technical architecture, 1819
technical view, 81
Intelligence information markup language
(ICML), 261
Intelligence portal services, 26364
Intelligence process
analysis-synthesis balancing, 4648
analysis-synthesis focusing, 4546
assessments and reengineering, 4448
collection/analysis balance, 45
KM in, 4244
observable events, 40
phases, 43
planning, 4042
target modeling, 41
Intelligence product values, 93
Intelligence SCM, 101
Intelligence services tier, 24445
cognitive services, 245
collaboration services, 245
digital production services, 245
indexing, query, retrieval, 244
operational processing, 244
workflow management, 245
Intelligence targets
apprehension, 188
of IO, 190
predication, 188
reasoning, 188
categorization, 67
as central asset, 56
defined, 3
detection/discovery methods, 276
domain, 277
dynamic, 59
enhancement strategy components, 61
environment, 16
explicit, 6368
flow, to action, 60
flowpaths, 113
focused creation, 60
inferred, 277
intangibles, 86
investment in, 109
as object, 6368
as process, 6871
repositories, 113
significance, 8
sources, 113
tacit, 6368
transferring, 12627
valuation of, 85
workers, 103
Knowledge, attitude, skills (KAS), 122
Knowledge and Process Management, 22
Knowledge-based organization, 10753
communities of practice, 11517
fundamental clusters, 115
KM project initiation, 11718
organizational collaboration, 12942
organizational learning, 12129
organizational problem solving, 14251
storytelling and, 11821
structures, 11215
virtues and disciplines, 10921
Knowledge-centric approach, 8
Knowledge creation, 274
enabling conditions, 74
hierarchy, 62
Knowledge creation model, 7174
explicit to explicit (combination), 72
explicit to tacit (internalization), 7273
illustrated, 72
tacit to explicit (externalization), 7172
tacit to tacit (socialization), 71
The Knowledge Creating Company, 71
Knowledge culture developers, 83
Knowledge discovery. See Data mining
Knowledge management (KM)
applications, 8485
defined, 1, 55
definitions, 57
disciplines, 84
elements, 56
first generation, 333
future generations of, 33435
information sources, 2223
infrastructure introduction, 10
innovation, 103
intelligence and, 124
in intelligence process, 4244
IT role in, 32731
networks, 118
organic school of, 120
perspectives, 108
projects, initiating, 11718
research in national security applications,
33132
second generation, 1023, 334
strategy basis, 57
taxonomy of, 8085
technology and tools, 85
See also KM technologies
Knowledge management processes, 12,
55104
defined, 8081
disciplines and supporting tools/
techniques, 82, 83
intelligence enterprise, 81
practical transaction, 70
Knowledge representation
defined, 213
layers, 273
Knowledge strategists, 83
Koestler
bisociation, 171
common forms of human inductive
discovery, 170
constructive-destructive process, 172
graphical representation of discovery, 171
Leadership deception, 231
Leading the Revolution, 9
Learning
defining/measuring, 12223
distance, canned (DL-canned), 128
distance, collaborative (DL-collaborative),
128
Learning (continued)
distance, remote (DL-remote), 12728
formal, 12729
informal, 12527
modes, 12529
organizational, 12129
residential (RL), 127
team, 122
transformational, 122
Lockwood analytic method for prediction
(LAMP), 145
Maintenance, 69
Management International Review, 21
Mapping
defined, 112
five stages of, 11214
organizational knowledge structures,
11215
process, 113
Marshaling evidence
defined, 176
hypothesis argumentation structures and,
212
M&S support, 290
structuring arguments and, 21113
See also Evidence
Matrix analysis, 225
Matrix synthesis, 225
Measurements and signatures intelligence
(MASINT), 37
Measures, 32, 33
Mental models
defined, 122
formation within intelligence workflow,
207
Military dominance, 16
Military intelligence (MI), 16
defined, 14
focus, 32
information sources, 2021
objects of analysis, 32
operational, 32
reporting cycle, 32
tactical, 32
See also Intelligence
Military Intelligence Professional Bulletin, 21
Mind
as intelligence target, 65
as knowledge manager, 65
Object(s)
of analysis, 32
intelligence, 13
refinement, 279, 28081
Observe-orient-decide-act (OODA) loop,
199
Open source intelligence (OSINT), 20, 35,
49
Operational architecture, 18
defined, 303
illustrated, 304
products, 305
Operational processing, 244
Organization, this book, 2324
Organizational analysis, 120
Organizational collaboration, 12942
culture, 13133
environments, 13337
intelligence workflow, 13742
See also Knowledge-based organization
Organizational competencies, 11718
Organizational knowing, 70
Organizational learning, 12129
defining/measuring, 12223
knowledge maturity measurement,
12325
modes, 12529
See also Knowledge-based organization
Organizational link analysis process flow, 228
Organizational problem solving, 14251
critical, structured thinking, 14347
naturalistic decision making, 15051
problems, 14243
strategy flow, 146
systems thinking, 14750
See also Knowledge-based organization
Organizational values/virtues, 11012
People, 80, 103
disciplines and supporting tools/
techniques, 8283
intelligence enterprise, 81
investment in, 109
knowledge culture developers, 83
knowledge strategists, 83
perspective, 107
pointers to, 115
People-Capability Maturity Model
(P-CMM), 12324
Performance drivers, 91
understanding, 65
See also Knowledge
Target-observer model, 278
Tasking, collection, processing, exploitation,
dissemination (TCPED), 100
Tasking, processing, exploitation,
dissemination (TPED), 200
Technical architecture, 1819
defined, 303
illustrated, 304
products, 305
Tension factor, 44
Thinking
analytic, 148
critical, structured, 14347
systems, 14750
Third wave, 6
Threat-warning intelligence, 13
Toulmin's argument structure, 215–16
Tradecraft Notes, 15253
Transfer, 6970, 71
Transformation, 69
Transformational learning, 122
Unified Modeling Language (UML), 303
U.S. DoD
contrast in perspectives, 59
Joint Directors of Laboratories (JDL) data
fusion model, 200, 279
KM and, 58
KM technology developers, 331
research development, test, evaluation
budget, 340
views of intelligence enterprise, 303, 304
U.S. DoD Architecture Framework, 3035
defined, 303
operational architecture, 303
products, 3045
systems architecture, 303
technical architecture, 303
view illustration, 304
views, 3034
U.S. IC, 44
chains of purpose and values for, 31
chief information officer (CIO), 263
collaboration, 13738
evolution/revolution alternatives, 49
KM technology developers, 331
metadata/metadata markup working
group, 261
Operational Network (ICON), 300, 301
stakeholders, 2930
Strategic Investment Plan for Analysis,
24950
U.S. National Security Agency (NSA)
KM definition, 58
perspectives contrast, 59
Value chain, 32, 33
Value propositions
categories, 8788
CI unit case study, 3089
customer intimacy, 88
defined, 87
operational excellence, 87
product-to-market excellence, 8788
Values
categories, 110
intelligence product, 93
organizational, 11012
service, 93
Virtual collaboration, 118
Virtual Corporation, 67
Waterfall flow, 313
Workflow management, 245
XML, 26061
predefined metadata format, 263
SIGINT content example, 261
Zachman Architecture Framework, 302
Artech House
46 Gillingham Street
Norwood, MA 02062
Phone: 781-769-9750
Fax: 781-769-6334
e-mail: [email protected]
e-mail: [email protected]