10-Performance Systems Analysis
Ch. 1 Introduction
Ch. 2 Common Mistakes and How to Avoid Them
Ch. 3 Selection of Techniques and Metrics
CH. 1 INTRODUCTION
Workload Characterization
Ex. (1.3) The number of packets lost on two links was measured for four file sizes, as shown in Table 1.1. Which link is better?
Table 1.1 Lost Packets on Two Links
How many experiments are needed? How does one estimate the
performance impact of each factor?
1.1 Outline of Topics (6)
5. Perform simulations correctly.
In designing a simulation model, one has to select a language for simulation, select seeds and algorithms for random-number generation, decide the length of the simulation run, and analyze the simulation results.
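As a rough illustration, the Python sketch below makes those choices explicit for a hypothetical single-server queue: the seed is fixed for reproducibility, and several run lengths and seeds are compared to see how stable the estimate is. The arrival and service rates, seeds, and run lengths are illustrative values, not taken from the text.

```python
import random

def simulate_queue(arrival_rate, service_rate, num_customers, seed):
    """Simulate a single-server queue with exponential interarrival and
    service times; return the mean waiting time (a hypothetical model)."""
    rng = random.Random(seed)            # explicit seed choice for reproducibility
    arrival = departure = total_wait = 0.0
    for _ in range(num_customers):
        arrival += rng.expovariate(arrival_rate)           # next arrival time
        start = max(arrival, departure)                    # wait if the server is busy
        departure = start + rng.expovariate(service_rate)  # service completion time
        total_wait += start - arrival
    return total_wait / num_customers

# Run-length and seed check: estimates from short runs vary much more
# across seeds than estimates from long runs.
for run_length in (1_000, 100_000):
    estimates = [simulate_queue(0.8, 1.0, run_length, seed) for seed in (1, 2, 3)]
    print(run_length, [round(w, 2) for w in estimates])
```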
Example 1.7
The throughputs of two systems A and B were measured in
transactions per second.
The results are shown in Table 1.2.
Table 1.2 Throughput in Transactions per Second
System    Workload 1    Workload 2
A             20            10
B             10            20
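A small Python sketch of this comparison (the summarization choices are only for illustration): both systems have the same mean throughput, yet averaging the ratios of throughputs makes each system look 25% better than the other, which is why how the results are analyzed matters.

```python
throughput = {                      # transactions per second, from Table 1.2
    "A": {"Workload 1": 20, "Workload 2": 10},
    "B": {"Workload 1": 10, "Workload 2": 20},
}

for system, results in throughput.items():
    other = "B" if system == "A" else "A"
    mean = sum(results.values()) / len(results)
    # Average of the per-workload throughput ratios relative to the other system.
    ratios = [results[w] / throughput[other][w] for w in results]
    print(system, "mean =", mean, "| mean ratio vs", other, "=", sum(ratios) / len(ratios))
```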
ACM SIGSIM: Special Interest Group on SIMulation – Simulation Digest
CMG: Computer Measurement Group, Inc. – CMG Transactions
1.3 Professional Organizations,
Journals, and Conferences (2)
SIAM: SIAM Review, SIAM Journal on Control & Optimization, SIAM Journal on Numerical Analysis, SIAM Journal on Computing, SIAM Journal on Scientific and Statistical Computing, and Theory of Probability & Its Applications
1.3 Professional Organizations,
Journals, and Conferences (3)
ORSA: Operations Research, ORSA Journal on Computing, Mathematics of Operations Research, Operations Research Letters, and Stochastic Models
[Figure: "What goals?"; a general-purpose model vs. a model built for particular goals]
2.1 Common Mistakes in
Performance Evaluation (2)
Biased Goals
The problem of stating the goals is that of finding the right metrics and workloads for comparing the two systems, not that of finding the metrics and workloads such that our system turns out better.
[Figure: the analyst's role is like that of a jury: be unbiased rather than picking whatever one likes]
[Figure: Parameter A, Metric B, Workload C, and Factor D picked arbitrarily (an unsystematic approach)]
2.1 Common Mistakes in
Performance Evaluation (4)
Analysis without Understanding the Problem
Defining a problem often takes up to 40% of the total effort.
A problem well stated is half solved.
Of the remaining 60%, a large share goes into designing alternatives, interpreting the results, and presenting the conclusions.
[Figure: Model A and Model B leading to the final results]
2.1 Common Mistakes in
Performance Evaluation (5)
[Figure: comparing systems by MIPS; analytical modeling vs. simulation of a network]
2.1 Common Mistakes in
Performance Evaluation (8)
Overlooking Important Parameters
It is a good idea to make a complete list of system and workload characteristics that affect the performance of the system.
System parameters
- quantum size : CPU allocation
- working set size : memory allocation
Workload parameters
- the number of users
- request arrival patterns
- priority
2.1 Common Mistakes in
Performance Evaluation (9)
Ignoring Significant Factors
Parameters that are varied in the study are called factors.
Not all parameters have an equal effect on the performance.
: if the packet arrival rate rather than the packet size affects the response time of a network gateway, it is better to use several different arrival rates in studying its performance.
It is important to identify those parameters that, if varied, will make a significant impact on the performance.
It is important to understand the randomness of various system and
workload parameters that affect the performance.
The choice of factors should be based on their relevance and not on
the analyst’s knowledge of the factors.
For unknown parameters, a sensitivity analysis, which shows the effect of changing those parameters from their assumed values, should be done to quantify the impact of the uncertainty.
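A minimal sketch of such a sensitivity analysis in Python, assuming a hypothetical single-queue response-time model and an illustrative ±20% perturbation of each assumed parameter value:

```python
def response_time(arrival_rate, service_time):
    """Hypothetical M/M/1-style response-time model, used only for illustration."""
    utilization = arrival_rate * service_time
    return service_time / (1.0 - utilization)   # valid only while utilization < 1

assumed = {"arrival_rate": 0.5, "service_time": 1.0}   # assumed (uncertain) values
base = response_time(**assumed)

# Vary each uncertain parameter by +/-20% from its assumed value and
# report the relative change in the model output.
for name in assumed:
    for factor in (0.8, 1.2):
        varied = dict(assumed, **{name: assumed[name] * factor})
        change = response_time(**varied) / base - 1.0
        print(f"{name} x{factor}: response time changes by {change:+.0%}")
```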
2.1 Common Mistakes in
Performance Evaluation (10)
Inappropriate Experimental Design
Experimental design relates to the number of measurement or simulation experiments to be conducted and the parameter values used in each experiment.
A simple design may lead to wrong conclusions if the parameters interact such that the effect of one parameter depends upon the values of other parameters.
Better alternatives are full factorial experimental designs and fractional factorial designs.
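A full factorial design simply runs every combination of factor levels; the Python sketch below enumerates such a design for hypothetical factors (the factor names and levels are assumptions, not from the text).

```python
from itertools import product

# Hypothetical factors and their levels.
factors = {
    "cpu_quantum_ms": [10, 50],
    "memory_mb": [512, 2048],
    "num_users": [1, 16, 64],
}

# Full factorial design: every combination of levels (2 x 2 x 3 = 12 experiments).
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for run, config in enumerate(design, start=1):
    print(run, config)

# A fractional factorial design would run only a carefully chosen subset of
# these combinations, trading some interaction information for fewer runs.
```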
2.1 Common Mistakes in
Performance Evaluation (11)
Inappropriate Level of Detail
The level of detail used in modeling a system has a significant impact on the problem formulation.
Avoid formulations that are either too narrow or too broad.
A common mistake is to take the detailed approach when a
high-level model will do and vice versa.
It is clear that the goals of a study have a significant impact on what is modeled and how it is analyzed.
2.1 Common Mistakes in
Performance Evaluation (12)
No Analysis
One of the common problems with measurement projects is that they are often run by performance analysts who are good at measurement techniques but lack data-analysis expertise.
They collect enormous amounts of data but do not know how to analyze or interpret it.
[Figure: an analyst explaining how one can use the results]
2.1 Common Mistakes in
Performance Evaluation (13)
Erroneous Analysis
There are a number of mistakes analysts commonly make in measurement, simulation, and analytical modeling, for example, taking the average of ratios or running simulations that are too short.
[Figure: a metric plotted against simulation time]
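As a quick illustration of the average-of-ratios pitfall, the Python sketch below uses made-up benchmark times: averaging per-benchmark speedups gives a different answer than the speedup computed from total times, so the choice of summary must be justified.

```python
# Hypothetical execution times in seconds (old system, new system).
benchmarks = {"sort": (10.0, 2.0), "compile": (2.0, 4.0)}

# Average of ratios: mean of the per-benchmark speedups.
speedups = [old / new for old, new in benchmarks.values()]
average_of_ratios = sum(speedups) / len(speedups)   # (5.0 + 0.5) / 2 = 2.75

# Ratio of totals: speedup computed from total running times.
total_old = sum(old for old, _ in benchmarks.values())
total_new = sum(new for _, new in benchmarks.values())
ratio_of_totals = total_old / total_new             # 12.0 / 6.0 = 2.0

print(average_of_ratios, ratio_of_totals)           # the two summaries disagree
```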
2.1 Common Mistakes in
Performance Evaluation (14)
No Sensitivity Analysis
Often analysts put too much emphasis on the results of their analysis, presenting them as fact rather than evidence.
Without a sensitivity analysis, one cannot be sure whether the conclusions would change if the analysis were done in a slightly different setting.
Without a sensitivity analysis, it is difficult to assess the relative importance of various parameters.
2.1 Common Mistakes in
Performance Evaluation (15)
Ignoring Errors in Input
Often the parameters of interest cannot be measured.
The analyst needs to adjust the level of confidence in the model output obtained from such input data.
Input errors are not always equally distributed about the
mean.
[Figure: a 512-octet packet between the transmit buffer and the receive buffer]
2.1 Common Mistakes in
Performance Evaluation (16)
Improper Treatment of Outliers
Values that are too high or too low compared to a majority of
values in a set are called outliers.
Outliers in the input or model output present a problem.
If an outlier is not caused by a real system phenomenon, it
should be ignored.
Deciding which outliers should be ignored and which should be included is part of the art of performance evaluation and requires careful understanding of the system being modeled.
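One common way to flag candidate outliers (not the only one) is the interquartile-range rule sketched below in Python; the measurements are invented, and a flagged value still needs the judgment described above before it is dropped.

```python
import statistics

# Hypothetical response-time measurements in milliseconds; 950 looks suspicious.
samples = [12, 14, 13, 15, 11, 14, 950, 13, 12, 16]

q1, _, q3 = statistics.quantiles(samples, n=4)   # first and third quartiles
iqr = q3 - q1
low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr

outliers = [x for x in samples if x < low or x > high]
kept = [x for x in samples if low <= x <= high]
print("candidate outliers:", outliers)
print("mean with:", statistics.mean(samples), "mean without:", statistics.mean(kept))
```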
2.1 Common Mistakes in
Performance Evaluation (17)
Assuming No Change in the Future
It is often assumed that the future will be the same as the
past.
A model based on the workload and performance observed in
the past is used to predict performance in the future.
The future workload and system behavior are assumed to be the same as those already measured.
The analyst and the decision makers should discuss this assumption and limit the amount of time into the future for which predictions are made.
2.1 Common Mistakes in
Performance Evaluation (18)
Ignoring Variability
It is common to analyze only the mean performance since
determining variability is often difficult, if not impossible.
If the variability is high, the mean alone may be misleading to
the decision makers.
[Figure: weekly load demand with mean = 80; the mean alone is not useful to the decision maker]
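A short Python sketch of why the mean alone can mislead; the two weekly load series below are invented so that both have a mean of 80 but very different variability.

```python
import statistics

# Two hypothetical weekly load series, both with mean 80.
steady = [78, 81, 80, 79, 82, 80, 80]
bursty = [10, 150, 20, 160, 15, 155, 50]

for name, series in [("steady", steady), ("bursty", bursty)]:
    print(name,
          "mean =", round(statistics.mean(series), 1),
          "stdev =", round(statistics.stdev(series), 1),
          "max =", max(series))
```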
2.1 Common Mistakes in
Performance Evaluation (20)
Improper Presentation of Results
The eventual aim of every performance study is to help in
decision making.
The right metric to measure the performance of an analyst is
not the number of analyses performed but the number of
analyses that helped the decision makers.
[Figure: the analyst explains the results of the analysis using words, pictures, and graphs]
2.1 Common Mistakes in
Performance Evaluation (21)
Ignoring Social Aspects
Successful presentation of the analysis results requires two
types of skills: social and substantive.
- Writing and speaking : Social skills
- Modeling and data analysis : Substantive skills.
Acceptance of the analysis results requires developing trust between the decision makers and the analyst and presenting the results to the decision makers in a manner understandable to them.
Social skills are particularly important in presenting results that
are counter to the decision maker’s beliefs and values or that
require a substantial change in the design.
2.1 Common Mistakes in
Performance Evaluation (21)
Ignoring Social Aspects (cont.)
The presentation to the decision makers should use minimal analysis jargon and emphasize the final results, while the presentation to other analysts should include all the details of the analysis techniques.
Combining these two presentations into one could make it
meaningless for both audiences.
2.1 Common Mistakes in
Performance Evaluation (22)
Omitting Assumptions and Limitations
Assumptions and limitations of the analysis are often omitted
from the final report.
This may lead the user to apply the analysis to another context where the assumptions will not be valid.
[Figure: applying an analysis of a timesharing system to a dual-CPU system or a system with a different ALU]
[Figure: service flow: 1. request the service, 2. send the packets, 3. perform a number of different instructions, 4. request queries, 5. answer queries, 6. response]
2.2 A Systematic Approach to
Performance Evaluation (3)
Select Metrics
Select criteria to compare the performance. These criteria are called metrics.
In general, the metrics are related to the speed, accuracy, and availability of services.
The performance of a network
: the speed (throughput, delay), accuracy (error rate), and availability of the packets sent
The performance of a processor
: the speed of (time taken to execute) various instructions
2.2 A Systematic Approach to
Performance Evaluation (4)
List Parameters
Make a list of all the parameters that affect performance.
The list can be divided into system parameters and workload
parameters.
System parameters
: Hardware/software parameters
: These generally do not vary among various installations of the system.
Workload parameters
: Characteristics of user’s requests
: These vary from one installation to the next.
2.2 A Systematic Approach to
Performance Evaluation (5)
[Figure: a client and a server connected by a network (the system under study)]
Case Study 2.1 (3)
Services
Two types of channel calls
: remote procedure call and remote pipe
The resources used by the channel calls depend upon the number of parameters passed and the action required on those parameters.
Data transfer is chosen as the application, and the calls will be classified simply as small or large depending upon the amount of data to be transferred to the remote machine.
The system offers only two services
: small data transfer or large data transfer
Case Study 2.1 (4)
Metrics
Due to resource limitations, the errors and failures will not be studied. Thus, the study will be limited to correct operation only.
Resources: the local computer (client), the remote computer (server), and the network link
Performance Metrics
- Elapsed time per call
- Maximum call rate per unit of time or equivalently, the time
required to complete a block of n successive calls
- Local CPU time per call
- Remote CPU time per call
- Number of bytes sent on the link per call
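If the monitoring program logged per-call measurements, the metrics above could be summarized roughly as in the Python sketch below; the field names and sample records are assumptions for illustration, not part of the case study.

```python
# Hypothetical per-call records produced by the monitoring program.
calls = [
    {"elapsed_s": 0.021, "local_cpu_s": 0.004, "remote_cpu_s": 0.006, "bytes_sent": 640},
    {"elapsed_s": 0.019, "local_cpu_s": 0.003, "remote_cpu_s": 0.005, "bytes_sent": 640},
    {"elapsed_s": 0.025, "local_cpu_s": 0.004, "remote_cpu_s": 0.007, "bytes_sent": 640},
]

n = len(calls)
per_call = {key: sum(c[key] for c in calls) / n for key in calls[0]}  # per-call averages
block_time = sum(c["elapsed_s"] for c in calls)   # time for a block of n successive calls
max_call_rate = n / block_time                    # calls completed per second

print("per-call averages:", per_call)
print(f"block of {n} calls took {block_time:.3f} s -> {max_call_rate:.1f} calls/s")
```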
Case Study 2.1 (5)
Parameters
System Parameters
Speed of the local CPU, the remote CPU, and the network
Operating system overhead for interfacing with the channels
Operating system overhead for interfacing with the networks
Reliability of the network affecting the number of retransmissions required
Workload Parameters
Time between successive calls
Number and sizes of the call parameters
Number and sizes of the results
Type of channel
Other loads on the local and remote CPUs
Other loads on the network
Case Study 2.1 (6)
Factors
Type of channel
: Two types – remote pipes and remote procedure calls
Speed of the network
: Two locations of the remote hosts will be used – short distance (on campus) and long distance (across the country)
Sizes of the call parameters to be transferred
: Two levels will be used – small and large
Number n of consecutive calls
: Eleven different values of n – 1, 2, 4, 8, 16, 32, 64, 128, 256, 512, and 1024
All other parameters will be fixed.
The retransmissions due to network errors will be ignored.
Experiments will be conducted when there is very little other load on
the hosts and the network.
Case Study 2.1 (7)
Evaluation Technique
Since prototypes of both types of channels have already been
implemented, measurements will be used for evaluation.
Analytical modeling will be used to justify the consistency of
measured values for different parameters.
Workload
A synthetic program generating the specified types of channel requests
This program will also monitor the resources consumed and log the measured results (using null channel requests).
Case Study 2.1 (8)
Experimental Design
A full factorial experimental design with 2³ × 11 = 88 experiments (two levels for each of the first three factors and eleven values of n) will be used for the initial study.
Data Analysis
Analysis of variance will be used to quantify the effects of the
first three factors and regression will be used to quantify
the effects of the number n of successive calls.
Data Presentation
The final results will be plotted as a function of the block size n.
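A minimal sketch of the planned regression step in Python, assuming the measured block times follow a linear per-call cost plus a fixed overhead; the numbers below are synthetic placeholders, and numpy's least-squares fit stands in for whatever analysis tool is actually used.

```python
import numpy as np

# Hypothetical measured times (seconds) for blocks of n successive calls.
n_values = np.array([1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024])
rng = np.random.default_rng(0)
block_times = 0.005 + 0.0012 * n_values + rng.normal(0.0, 0.0005, size=n_values.size)

# Linear regression: block_time ~ per_call_cost * n + fixed_overhead.
per_call_cost, fixed_overhead = np.polyfit(n_values, block_times, 1)
print(f"per-call cost ~ {per_call_cost * 1e3:.2f} ms, fixed overhead ~ {fixed_overhead * 1e3:.2f} ms")
```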