
UNCLASSIFIED

Establishing System Measures of Effectiveness


John M. Green
Senior Member
Senior Principal Systems Engineer
Raytheon Naval & Maritime Integrated Systems
San Diego, CA 92123

Abstract

One of the most important tasks in the systems development process is that of performance analysis. It is needed to ensure that the system meets its requirements, is delivered on schedule, and is developed within allocated costs. It consists of two phases: performance prediction and performance measurement. Proper selection of performance measurement attributes is essential to this process. These measurement attributes, commonly called "measures of effectiveness" or "MOEs," provide quantifiable benchmarks against which the system concept and implementation can be compared. Early in the life of a system, prediction is required for feasibility and specification development. Towards the end of system implementation and development, performance measurement techniques play a major role in system testing and verification. Choosing incorrect MOEs will result in a system that does not meet customer expectations. This paper introduces a comprehensive and systematic process by which viable MOEs that quantify and analyze system performance may be developed. The approach is based on research into Command and Control system evaluation performed at the Naval Postgraduate School during the late 1980s. This paper extends that research to open systems in general and develops several of the original theoretical concepts in more detail.

Introduction

The design of a system is an ill-posed problem that has no solution without a set of criteria to guide choices [Oliver et al., 1997]. Morse and Kimball first addressed the issue of performance prediction and measurement in the summary of their World War II analytic work published as "Methods of Operations Research" [Morse and Kimball, 1970].¹ They cite an excellent example of how effectiveness measures in systems thinking can be used to shape systems to operate in a particular environment. Antisubmarine warfare systems were a high priority because of the U-boat threat and the attrition of merchant shipping. The question to be answered was: what was the difference between an aggressive and a defensive use of antisubmarine formations and their component antisubmarine warfare systems? For the former use, the expected number of U-boats killed by an antisubmarine hunter-killer group was the effectiveness measure. For the latter, the effectiveness measure was the probability of preventing penetration of the convoy formation by the U-boat. Thus, while both formations had ships as basic component systems, hunter-killer groups were centered on an aircraft carrier and antisubmarine destroyers employing a methodical search-and-destroy process, whereas the convoy was comprised of merchant ships and destroyer escorts using high-speed transit of the submarine areas as a defensive process. It can be seen from this example that effectiveness measures are critically important because they are the criteria that drive the system solution that is found.

Selection of Performance Measure Attributes

Oliver's approach to bringing definition to the ill-defined problem is to break the systems engineering process into two parts: what he describes as the Systems Engineering Management process and the Systems Engineering Technical process. Within the Systems Engineering core technical process he describes six steps, as shown in Figure 1. This process recognizes that an open system interacts with the environment beyond its boundaries. Steps 2, 3, and 4 are crucial in that they bound the system, capture the system's behavior, and define the effectiveness measures: the criteria that mean success or failure. His approach is based upon the concept that a system is a unified collection or set of objects that exhibits a unique set of behaviors when combined in its operating environment. It is very similar to the approach shown in Figure 2, which was developed by the Military Operations Research Society's (MORS) work on Measures of Effectiveness for Command and Control. It also focuses on an early bounding of the system followed by selection of performance measures [Sweet et al., 1985].

¹ The work was originally known as Operations Evaluation Group (OEG) Report 58.
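The contrast between Morse and Kimball's two effectiveness measures can be made concrete with a small numerical sketch. All functions, rates, and probabilities below are hypothetical values chosen only to illustrate how the two measures reward different formation designs; they are not historical data.

```python
# Hypothetical illustration of two different effectiveness measures.
# All numbers and model forms are assumptions, not historical values.

def hunter_killer_moe(sorties: int, p_kill_per_sortie: float) -> float:
    """Offensive MOE: expected number of U-boats killed by the group."""
    return sorties * p_kill_per_sortie

def convoy_moe(n_escorts: int, p_penetrate_one_escort: float) -> float:
    """Defensive MOE: probability the U-boat is prevented from penetrating
    the escort screen (assumes escorts act independently)."""
    return 1.0 - p_penetrate_one_escort ** n_escorts

# The two measures drive different solutions: adding search sorties raises
# the first, while adding escorts raises the second.
print(hunter_killer_moe(sorties=40, p_kill_per_sortie=0.05))  # expected kills
print(convoy_moe(n_escorts=4, p_penetrate_one_escort=0.6))    # screen effectiveness
```

Optimizing the first measure favors a carrier-centered search-and-destroy force; optimizing the second favors escorts and fast transit, exactly the divergence the example describes.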

Distribution Statement A: Approved for public release; distribution is unlimited.


From the Proceedings of the AIAA 2nd Biennial National Forum on Weapon System Effectiveness, held at the Johns Hopkins University/Applied Physics Laboratory, 27-29 March 2001. Controlling agency: OSD, Pentagon, Washington, DC.

The logic behind bounding the system early is driven by the performance measures development process. The system boundaries define the set of system parameters that drive system performance. A change in the boundaries changes the parameter set, and with it the resulting system behavior and performance. This is often overlooked in the system development process. There is an expectation that there is a magic list of canned effectiveness measures that systems engineers can use like a lookup table in the early stages of development. Failure to understand this point can have a ripple effect throughout the system lifecycle.

Figure 1. The Steps of the Core Technical Process (1.0 Assess Available Information; 2.0 Define Effectiveness Measures; 3.0 Create Behavior Model; 4.0 Create Structural Model; 5.0 Perform Trade Studies, yielding either a Feasible Solution or a return through the process when No Feasible Solution exists; 6.0 Create Sequential Build & Test Plan)
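The dependence of the parameter set on the chosen boundary can be sketched in a few lines. The subsystem names and their parameters below are hypothetical, chosen only to show that moving the boundary changes which parameters exist and hence what behavior the model can exhibit:

```python
# Illustrative sketch (hypothetical subsystems and parameters): the system
# boundary fixes the parameter set that drives performance.

SUBSYSTEM_PARAMETERS = {
    "sensor":      {"aperture_m", "bandwidth_hz", "snr_db"},
    "processor":   {"bit_error_rate", "latency_s"},
    "environment": {"attenuation_db_per_km", "noise_level_db"},
}

def parameter_set(boundary: set[str]) -> set[str]:
    """The parameters available to the analysis are exactly those of the
    objects placed inside the boundary."""
    params: set[str] = set()
    for subsystem in boundary:
        params |= SUBSYSTEM_PARAMETERS[subsystem]
    return params

closed = parameter_set({"sensor", "processor"})
open_system = parameter_set({"sensor", "processor", "environment"})
# Widening the boundary to include the environment adds parameters,
# and therefore changes the behavior the model can predict.
print(sorted(open_system - closed))  # ['attenuation_db_per_km', 'noise_level_db']
```

This is the sense in which an open system "interacts with the environment beyond its boundaries": leaving the environment outside the boundary silently removes its parameters from every downstream performance measure.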

Definition of Effectiveness Measures

There are a number of terms used to describe system performance. While several of these terms are often used interchangeably to describe the same thing (e.g., Measures of Performance (MOPs) is interchanged with Measures of Effectiveness (MOEs)), the MORS work recognized that there is indeed a hierarchy of effectiveness measures. MORS identified the following key concepts: parameters, Measure(s) of Performance, Measure(s) of Effectiveness, and Measure(s) of Force Effectiveness. While the latter term is not appropriate for systems in general, the idea is valid and will be addressed below.

This hierarchy follows the system of systems concept first presented by Ackoff [Ackoff, 1971]. The following definitions clarify the hierarchy.

Parameters: the properties or characteristics inherent in the physical entities whose values determine system behavior and structure, even when the system is not operating. Typical examples include signal-to-noise ratio, bandwidth, frequency, aperture dimensions, and bit error rates.

Figure 2. The MORS Command and Control System Definition Process (Formulate Problem, Bound System, Define Process, Integrate System Elements & Functions, Specify Measures, Generate Data, Aggregate Measures, and Implement Results, performed in sequence with the Decision Maker in the loop)
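The hierarchy this process leads to, with parameters quantified by MOPs and MOPs aggregated into MOEs, can be sketched as follows. The logistic detection model, the threshold, and all values are simplified assumptions for illustration, not a validated sensor equation:

```python
import math

# Hypothetical sketch of the measurement hierarchy:
# parameter (SNR) -> MOP (detection probability) -> MOE (raid annihilation).

def detection_probability(snr_db: float, threshold_db: float = 10.0) -> float:
    """MOP: single-look probability of detection, modeled here as a
    logistic function of the SNR margin (an illustrative assumption)."""
    margin = snr_db - threshold_db
    return 1.0 / (1.0 + math.exp(-margin))

def raid_annihilation_probability(p_detect: float,
                                  p_kill_given_detect: float,
                                  n_raiders: int) -> float:
    """MOE: probability that every raider is detected and killed,
    aggregated from the MOP under an independence assumption."""
    p_one = p_detect * p_kill_given_detect
    return p_one ** n_raiders

p_d = detection_probability(snr_db=13.0)          # MOP driven by a parameter
moe = raid_annihilation_probability(p_d, 0.9, 4)  # MOE aggregating the MOP
```

Note how the parameter (SNR) never appears in the MOE directly: it influences the outcome only through the MOP, which is the layering the MORS hierarchy formalizes.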


Measures of Performance (MOP): measures derived from the dimensional parameters (both physical and structural) that quantify attributes of system behavior. MOPs quantify the set of selected parameters. Examples include sensor detection probability, sensor probability of false alarm, and probability of correct identification.

Measures of Effectiveness (MOE): a measure of how a system performs its functions within its environment. An MOE is generally an aggregation of MOPs. Examples include survivability, probability of raid annihilation, and weapon system effectiveness.

In the MORS work the term Measures of Force Effectiveness was defined as:

Measures of Force Effectiveness (MOFE): a measure of how a system and the force (sensors, weapons, C3 system) of which it is a part perform military missions.

This last definition can be modified for the general systems case as follows:

Measures of Systems Effectiveness (MOSE): a measure of how a system of systems performs its mission.

The relationship between the various elements of the hierarchy is shown in Figure 3.

Figure 3. The Effectiveness Measures Hierarchy (concentric levels: Dimensional Parameters within the System, the System within the Force, and the Force within the Environment; MOPs, MOEs, and MOFEs apply at successive levels)

Features of Measures

With effectiveness measures defined and ordered, it bears repeating that they are the standards against which the performance of a system is compared to determine to what extent user requirements are met. They are the criteria used to make the trade-off decisions of what to build. They are the criteria that drive the system solution that is found.

Effectiveness measures are derived from first principles. They are mission and scenario dependent and must discriminate between choices. System parameters are often mistakenly used as effectiveness measures. As an example, sensor search rate has been specified as an MOP; however, increasing the search rate of a sensor improves the probability of detection, thus search rate is a parameter.

Effectiveness measures must be both measurable and testable. This means that they are quantitative in nature. Further, they have to realistically measure the system's purpose or objective. Failure to do so results in a system that fails to meet its purpose.

The issue of sensitivity is important. The effectiveness measures not only need to reflect a change in the parameter set; they must also have a reference from which the change can be evaluated. Doubling the value of a parameter does not necessarily correspond to a doubling of the effectiveness measure. Expressing MOPs, MOEs, and MOSEs as probabilities allows us to determine whether a parametric change is statistically significant.

Finally, effectiveness measures must be independent at the level of analysis under evaluation. In other words, MOPs should be independent but can be aggregated into MOEs. The MOEs should be independent of each other and can be aggregated into an MOSE.

Table 1 summarizes the desired characteristics of measures.

Characteristic: Definition
Mission Oriented: Relates to the force/system mission.
Discriminatory: Identifies real differences between alternatives.
Measurable: Can be computed or estimated.
Quantitative: Can be assigned numbers or ranked.
Realistic: Relates realistically to the system and associated uncertainties.
Objective: Defined or derived independent of subjective opinion.
Appropriate: Relates to acceptable standards and analysis objectives.
Sensitive: Reflects changes in system variables.
Inclusive: Reflects those standards required by the analysis objectives.
Independent: Mutually exclusive with respect to other measures.
Simple: Easily understood by the user.

Table 1. Desired Characteristics of Effectiveness Measures

Application

Figure 4 captures the relationship of the parameters and effectiveness measures in the performance prediction process. The parameters are input into the modeling process along with the scenario requirements and environmental conditions. The effectiveness measures are the logical output.

Figure 4. Modeling System Performance [Leite and Mensh, 1999] (dimensional parameters, scenario inputs, and environmental inputs feed the model; MOPs, MOEs, MOFEs, and scenario results are its outputs)

Consider the following example: a football team is a system of systems. Its effectiveness measure is its probability of winning. It is composed of four major subsystems: coaching staff, offense, defense, and special teams. An example MOE would be the probability of calling scoring plays (or a sequence of plays that lead to a score) by the coaching staff. An MOP would be the probability of completing a pass by the offense, or the defense's probability of causing an incomplete pass. The mix of players on the field would determine the parameters (yards per carry, etc.) at any given time. Changing the quarterback would change the probability of completing a pass.

The Role of Time

Time is often used as an effectiveness measure, and the concept of timeliness is attractive in that role. However, this paper argues that time is a parameter and thus should not be cast in this role. Time is the independent variable, and outcomes of processes occur with respect to time. L'Etoile defines Critical Time (Tc) as the time within which the mission (task) must be completed to be successful [L'Etoile, 1985]. Using Tc as the independent variable, the effectiveness measure is the probability of completing the task within the allowed time.

Summary

The system bounding process is the starting point for determining effectiveness measures, in that it defines the set of parameters and their hierarchical structure within the system of systems that drives system performance. Effectiveness measures are also hierarchical, with MOPs determined by sets of parameters, MOEs building off the aggregation of MOPs, and MOSEs building from the MOEs. Care must be taken to ensure that effectiveness measures reflect the system's objective. Care must also be taken not to confuse parameters with measures. If it can't be expressed as a probability, it probably is not an effectiveness measure.

Biography

Mr. Green is a Senior Principal Systems Engineer at Raytheon Naval and Marine Information Systems. He was the chair of the MORS MOE working group for several years and was involved in the original research upon which this paper is based.

References

[Ackoff, 1971] Ackoff, Russell L. Towards a System of Systems Concept, Management Science, Vol. 17, No. 11, July 1971, pp. 661-671.


[Andriole and Halpin, 1991] Andriole, Stephen J. and Stanley M. Halpin, editors. Information Technology for Command and Control: Methods and Tools for Systems Development and Evaluation, Piscataway, NJ: IEEE Press, 1991.
[Athans, 1987] Athans, Michael. Command and Control (C2) Theory: A Challenge to Control Science, IEEE Transactions on Automatic Control, Vol. AC-32, No. 4, April 1987, pp. 286-293.
[Bean, 1994] Bean, Theodore T. System Boundaries Within the MCES Paradigm, Phalanx, June 1994, pp. 23-26.
[Blanchard and Fabrycky, 1998] Blanchard, Benjamin S. and Wolter J. Fabrycky. Systems Engineering and Analysis, 3rd Ed., Upper Saddle River, NJ: Prentice Hall, 1998.
[Hall, 1992] Hall, David L. Mathematical Techniques in Multisensor Data Fusion, Boston: Artech House, 1992.
[Hwang et al., 1982] Hwang, John, et al., editors. Selected Analytical Concepts in Command and Control, New York: Gordon and Breach Science Publishers, 1982.
[Johnson and Levis, 1988] Johnson, Stuart E. and Alexander H. Levis, editors. Science of Command and Control: Coping with Uncertainty, Washington, D.C.: AFCEA Press, 1988.
[Johnson and Levis, 1989] Johnson, Stuart E. and Alexander H. Levis, editors. Science of Command and Control: Part II, Coping with Complexity, Washington, D.C.: AFCEA Press, 1989.
[Leite and Mensh, 1999] Leite, Michael J. and Dennis R. Mensh. Definition of Evaluation Criteria for System Development, Acquisition Modeling, and Simulation, Naval Engineers Journal, January 1999, pp. 55-64.
[L'Etoile, 1985] L'Etoile, A.S. NUSC Technical Memorandum TM-85-2075, 30 December 1985.
[Malerud et al., 1999] Malerud, S., Feet, E.H., Enemo, G., and Brathen, K. Assessing the Effectiveness of Maritime Systems: Measures of Merit, Proceedings of the 2000 Command and Control Research and Technology Symposium, Monterey, California, 2000.
[Metersky, 1986] Metersky, M.L. A C2 Process and an Approach to Design and Evaluation, IEEE Transactions on Systems, Man, and Cybernetics, Vol. SMC-16, No. 6, November 1986, pp. 880-889.
[Morse and Kimball, 1970] Morse, Philip M. and George E. Kimball. Methods of Operations Research, Los Altos, CA: Peninsula Publishing, 1970.
[Oliver et al., 1997] Oliver, D.W., Kelliher, T.P., and Keegan, J.G. Engineering Complex Systems with Models and Objects, New York: McGraw-Hill, 1997.
[Pawlowski, 1993a] Pawlowski, Thomas J. III, LTC. C3IEW Measures of Effectiveness Workshop, Phalanx, March 1993, pp. 14-16.
[Pawlowski, 1993b] Pawlowski, Thomas J. III, LTC, editor. Military Operations Research Society C3IEW Workshop, Final Report, September 6, 1993.
[Sovereign et al., 1994] Sovereign, M., Kempel, W., and Metzger, J. C3IEW Workshop II, Phalanx, March 1994, pp. 10-14.
[Sweet, 1986] Sweet, Ricki. Preliminary C2 Evaluation Architecture, Signal, January 1986, pp. 71-73.
[Sweet et al., 1985] Sweet, Ricki, Morton Metersky, and Michael Sovereign. Command and Control Evaluation Workshop, Military Operations Research Society, January 1985.
[Sweet et al., 1986] Sweet, Ricki, et al. The Modular Command and Control Structure (MCES): Applications of and Expansion to C3 Architectural Evaluation, Monterey: Naval Postgraduate School, 1986.
[Sweet et al., 1987] Sweet, Ricki, MAJ Patrick L. Gandee, USAF, and MAJ Michael D. Gray, USAF. Evaluating Alternative Air Defense Architectures, Signal, January 1987, pp. 49-58.
[Sweet and Lopez, 1987] Sweet, Ricki and Armando LaForm Lopez. Testing the Modular C2 Evaluation Structure and the Acquisition Process, Signal, August 1987, pp. 75-79.
[Sweet and Levis, 1988] Sweet, Ricki and Alexander H. Levis. SuperCINC Architecture Concept Definition and Evaluation, Signal, July 1988, pp. 65-68.
[DSMC, 1999] Systems Engineering Fundamentals, Fort Belvoir, VA: Defense Systems Management College Press, October 1999.
[Waltz and Llinas, 1990] Waltz, Edward and James Llinas. Multisensor Data Fusion, Boston: Artech House, 1990.
