MOP and MOE
[Figure: systems engineering process — 1.0 Assess Available Information; 2.0 Define Effectiveness Measures; 3.0 Create Behavior Model; 4.0 Create Structural Model; 5.0 Perform Trade Studies; 6.0 Create Sequential Build & Test Plan — with "Feasible Solution?" decision points looping back ("No") to earlier steps.]
Measures of Performance (MOP): measures derived from the dimensional parameters (both physical and structural) that quantify attributes of system behavior; MOPs quantify the selected set of parameters. Examples include sensor detection probability, sensor probability of false alarm, and probability of correct identification.

Measures of Effectiveness (MOE): measures of how a system performs its functions within its environment. An MOE is generally an aggregation of MOPs. Examples include survivability, probability of raid annihilation, and weapon system effectiveness.
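To make the idea of aggregation concrete, the sketch below composes a raid-annihilation MOE from kill-chain MOPs. The kill-chain model and all probability values are illustrative assumptions for this note, not a method or data from the paper (Python):

    # Illustrative sketch: aggregating MOPs into an MOE.
    # The kill-chain model and all probabilities are hypothetical.

    def p_single_shot_kill(p_detect: float, p_identify: float, p_hit: float) -> float:
        """MOPs combined along a simple detect -> identify -> hit chain."""
        return p_detect * p_identify * p_hit

    def p_raid_annihilation(p_kill: float, raid_size: int) -> float:
        """MOE: probability that every missile in an n-missile raid is killed,
        assuming independent engagements (a strong simplifying assumption)."""
        return p_kill ** raid_size

    p_kill = p_single_shot_kill(p_detect=0.95, p_identify=0.90, p_hit=0.85)
    print(f"P(single-shot kill)       = {p_kill:.3f}")                          # ~0.727
    print(f"P(raid annihilation, n=4) = {p_raid_annihilation(p_kill, 4):.3f}")  # ~0.279

Note how the aggregated measure degrades rapidly with raid size even when every individual MOP looks high; this is why trade-off decisions are made against the MOE rather than against any single MOP.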
In the MORS work the term Measures of Force Effectiveness was defined as:

Measures of Force Effectiveness (MOFE): measures of how a system and the force (sensors, weapons, C3 system) of which it is a part perform military missions.

This last definition can be modified for the general systems case as follows:

Measures of Systems Effectiveness (MOSE): measures of how a system of systems performs its mission.

The relationship between the various elements of the hierarchy is shown in Figure 3.

[Figure 3 residue: the hierarchy sits within its Environment; an accompanying table fragment lists Characteristics and Definitions, e.g., MOEs — Mission Oriented: relates to force/system.]

Effectiveness measures determine to what extent user requirements are met. They are the criteria used to make the trade-off decisions about what to build, and they are the criteria that drive the system solution that is found.

Effectiveness measures are derived from first principles. They are mission and scenario dependent and must discriminate between choices. System parameters are often mistakenly used as effectiveness measures. For example, sensor search rate has been specified as an MOP; however, increasing the search rate of a sensor improves the probability of detection, so search rate is a parameter, as the sketch below illustrates.
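A minimal sketch of that distinction, assuming the classical random-search (Koopman) detection model rather than any model specified in the paper: search rate is an input parameter, and the detection probability it drives is the measure (Python):

    # Sketch: random-search detection model (an assumed illustration).
    # Search rate is a parameter; P_d is the derived measure.
    import math

    def p_detect(search_rate: float, time_on_station: float, area: float) -> float:
        """Random-search model: P_d = 1 - exp(-search_rate * t / A)."""
        return 1.0 - math.exp(-search_rate * time_on_station / area)

    for rate in (25.0, 50.0, 100.0):  # km^2/h, hypothetical values
        pd = p_detect(rate, time_on_station=10.0, area=1000.0)
        print(f"search rate {rate:5.1f} -> P_d = {pd:.3f}")  # 0.221, 0.393, 0.632

Doubling the search rate raises P_d from 0.393 to 0.632, not to 0.786: the parameter and the measure do not move in lockstep.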
Effectiveness measures must be both measurable and testable, which means they are quantitative in nature. Further, they must realistically measure the system's purpose or objective; failure to do so results in a system that fails to meet its purpose.

The issue of sensitivity is important. The effectiveness measure not only needs to reflect a change in the parameter set, it must also have a reference from which the change can be evaluated. Doubling the value of a parameter does not necessarily correspond to a doubling of the effectiveness measure. Expressing MOPs, MOEs, and MOSEs as probabilities allows us to determine whether a parametric change is statistically significant.
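Because the measures are expressed as probabilities, a standard two-proportion z-test can decide whether a parametric change is statistically significant. The sketch below applies it to the two search rates above; the trial counts are hypothetical (Python):

    # Sketch: significance test for a probability-valued measure.
    # The z-test is standard statistics; the counts are hypothetical.
    import math

    def two_proportion_z(k1: int, n1: int, k2: int, n2: int) -> float:
        """z-statistic for H0: p1 == p2, using the pooled proportion."""
        p1, p2 = k1 / n1, k2 / n2
        pooled = (k1 + k2) / (n1 + n2)
        se = math.sqrt(pooled * (1.0 - pooled) * (1.0 / n1 + 1.0 / n2))
        return (p2 - p1) / se

    # 1000 simulated detection trials at each search rate.
    z = two_proportion_z(k1=393, n1=1000, k2=632, n2=1000)
    print(f"z = {z:.1f}")  # ~10.7; |z| > 1.96 is significant at the 5% level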
Figure 4. Modeling System Performance [Leite and Mensh, 1999]

Consider the following example: a football team is a system of systems. Its effectiveness measure is …

References

[Ackoff, 1971] Ackoff, Russell L. Towards a System of Systems Concept, Management Science, Vol. 17, No. 11, July 1971, pp. 661-671.
[Andriole and Halpin, 1991] Andriole, Stephen J. and Stanley M. Halpin, editors. Information Technology for Command and Control: Methods and Tools for Systems Development and Evaluation, Piscataway, NJ: IEEE Press, 1991.

[Athans, 1987] Athans, Michael. Command and Control (C2) Theory: A Challenge to Control Science, IEEE Transactions on Automatic Control, Vol. AC-32, No. 4, April 1987, pp. 286-293.

[Bean, 1994] Bean, Theodore T. System Boundaries Within the MCES Paradigm, Phalanx, June 1994, pp. 23-26.

[Blanchard and Fabrycky, 1998] Blanchard, Benjamin S. and Wolter J. Fabrycky. Systems Engineering and Analysis, 3rd Ed., Upper Saddle River, NJ: Prentice Hall, 1998.

[Hall, 1992] Hall, David L. Mathematical Techniques in Multisensor Data Fusion, Boston: Artech House, 1992.

[Hwang, et al., 1982] Hwang, John, et al., editors. Selected Analytical Concepts in Command and Control, New York: Gordon and Breach Science Publishers, 1982.

[Johnson and Levis, 1988] Johnson, Stuart E. and Alexander H. Levis, editors. Science of Command and Control: Coping with Uncertainty, Washington, D.C.: AFCEA Press, 1988.

[Johnson and Levis, 1989] Johnson, Stuart E. and Alexander H. Levis, editors. Science of Command and Control: Part II, Coping with Complexity, Washington, D.C.: AFCEA Press, 1989.

[Leite and Mensh, 1999] Leite, Michael J. and Dennis R. Mensh. Definition of Evaluation Criteria for System Development, Acquisition Modeling, and Simulation, Naval Engineers Journal, January 1999, pp. 55-64.

[L’Etoile, 1985] L’Etoile, A.S. NUSC Technical Memorandum TM-85-2075, 30 December 1985.

[Malerud et al., 1999] Malerud, S., Feet, E.H., Enemo, G., and Brathen, K. Assessing the Effectiveness of Maritime Systems — Measures of Merit, Proceedings of the 2000 Command and Control Research and Technology Symposium, Monterey, California, 2000.

[Metersky, 1986] Metersky, M.L. A C2 Process and an Approach to Design and Evaluation, IEEE Transactions on Systems, Man, and Cybernetics, Vol. SMC-16, No. 6, November 1986, pp. 880-889.

[Morse and Kimball, 1970] Morse, Philip M. and George E. Kimball. Methods of Operations Research, Los Altos, CA: Peninsula Publishing, 1970.

[Oliver et al., 1997] Oliver, D.W., Kelliher, T.P., and Keegan, J.G. Engineering Complex Systems with Models and Objects, New York: McGraw-Hill, 1997.

[Pawlowski, 1993a] Pawlowski, Thomas J. III, LTC. C3IEW Measures of Effectiveness Workshop, Phalanx, March 1993, pp. 14-16.

[Pawlowski, 1993b] Pawlowski, Thomas J. III, LTC, editor. Military Operations Research Society C3IEW Workshop, Final Report, Sept. 6, 1993.

[Sovereign et al., 1994] Sovereign, M., W. Kempel, and J. Metzger. C3IEW Workshop II, Phalanx, March 1994, pp. 10-14.

[Sweet, 1986] Sweet, Ricki. Preliminary C2 Evaluation Architecture, Signal, January 1986, pp. 71-73.

[Sweet et al., 1985] Sweet, Ricki, Morton Metersky, and Michael Sovereign. Command and Control Evaluation Workshop, Military Operations Research Society, January 1985.

[Sweet et al., 1986] Sweet, Ricki, et al. The Modular Command and Control Structure (MCES): Applications of and Expansion to C3 Architectural Evaluation, Monterey: Naval Postgraduate School, 1986.

[Sweet et al., 1987] Sweet, Ricki, MAJ Patrick L. Gandee, USAF, and MAJ Michael D. Gray, USAF. Evaluating Alternative Air Defense Architectures, Signal, January 1987, pp. 49-58.

[Sweet and Lopez, 1987] Sweet, Ricki and Armando LaForm Lopez. Testing the Modular C2 Evaluation Structure and the Acquisition Process, Signal, August 1987, pp. 75-79.

[Sweet and Levis, 1988] Sweet, Ricki and Alexander H. Levis. SuperCINC Architecture Concept Definition and Evaluation, Signal, July 1988, pp. 65-68.

[DSMC, 1999] Systems Engineering Fundamentals, Fort Belvoir, VA: Defense Systems Management College Press, October 1999.

[Waltz and Llinas, 1990] Waltz, Edward and James Llinas. Multisensor Data Fusion, Boston: Artech House, 1990.