Performance Test Plan
Version 1.00

Atul Pant
Audit Trail:

Date            Version   Name   Comment
April 2, 2013   1.0       Atul   Initial Revision
April 9, 2013   1.1       Atul   First round of revisions
Table of Contents
Reference Documents
Objectives and Scope
Exclusions
Approach and Execution Strategy
Load/Stress Test Types and Schedules
Test Measurements, Metrics, and Baseline
Performance/Capability Goals (Expected Results) and Pass/Fail Criteria
Software and Tools Used
Load Descriptions
Content and User Data Preparation
Load Script Recording
Load Testing Process
Training Needs
System-Under-Test (SUT) Environment
Test Deliverables
Team Members and Responsibilities
Risk Assessment and Mitigation
List of Appendices
Test Plan Approval

Reference Documents
Performance Scalability Goals.xls

Objectives and Scope
The purpose of this document is to outline the environment and performance test plan for
benchmarking Sakai 2.5.0 core tools for use in WileyPLUS E5. In general, the purposes of this testing are to:
 Validate that the core Sakai framework and certain tools meet the minimum performance standards
established for this project
 Performance-test any new Sakai tools that are developed
 Performance-test any changes to Sakai tools that are planned for WileyPLUS E5
 Performance-test any BackOffice applications or integrations

Exclusions
This test plan will not cover any functional or accuracy testing of the software being tested, nor any
browser or software compatibility testing.

Approach and Execution Strategy
Sakai will be tested using an existing Wiley performance test process. This test plan will serve as the
basis for Testware to create Silk Performer Test Scripts. These scripts will be run by Leo Begelman using
the Silk Performer software. Unicon, Inc. will monitor and measure the CPU utilization of the web and
database servers used during testing, and will analyze and present the performance test results to Wiley
at the conclusion of the performance test cycle.

Load/Stress Test Types and Schedules
The following tests will be run:
 Capacity Test – Determines the maximum number of concurrent users that the application
server can support under a given configuration while maintaining an acceptable response time
and error rate.
 Consistent Load Test – Long-running stress test that drives a continuous load on the application
server for an extended period of time (at least 6 hours). The main purpose of this type of test is
to ensure the application can sustain acceptable levels of performance over an extended period
of time without exhibiting degradation, such as might be caused by a memory leak.
 Single Function Stress Test – A test where 100 users perform the same function with no wait
times and no ramp-up time. This test will help determine how the application reacts to periods
of extreme stress in a very narrow area of the code.

 Baseline Test – At the conclusion of the Capacity Test and Consistent Load Test, a third test will
be established whose goal is to be a repeatable test that can be performed whenever any portion of
the system is changed. This test will not have the secondary goals of the other two tests; it will
simply exist as a known quantity, rather than seeking the breaking-point values the other tests are
interested in.
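As a rough illustration (not part of the Wiley/Testware toolchain), the Single Function Stress Test shape described above, many users hitting one function with no wait or ramp-up time, can be sketched with a thread pool. The transaction body and iteration count here are placeholders, not values from this plan:

```python
# Illustrative sketch only: 100 virtual users invoke the same transaction
# concurrently with no wait time and no ramp-up. target_transaction is a
# placeholder for the real recorded function under test.
import concurrent.futures
import time

USERS = 100        # per the plan: 100 users, same function
ITERATIONS = 5     # repetitions per user (illustrative, not from the plan)

def target_transaction(user_id: int) -> float:
    """Placeholder for one request; returns elapsed seconds."""
    start = time.monotonic()
    # ... issue the HTTP request for the function under test here ...
    return time.monotonic() - start

def run_stress() -> list[float]:
    """Fire all iterations at once through a pool of USERS workers."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=USERS) as pool:
        futures = [pool.submit(target_transaction, i % USERS)
                   for i in range(USERS * ITERATIONS)]
        return [f.result() for f in concurrent.futures.as_completed(futures)]
```

The returned timings feed directly into the response-time metrics defined in the next section.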

Test Measurements, Metrics, and Baseline
The following metrics will be collected.
Database Server:
 CPU Utilization – Max., Avg., and 95th percentile. This data will be collected using the sar system
utility.
 SQL query execution time: The time required to execute the top ten SQL queries involved in a
performance test run. This data will be collected using Oracle Stats Pack.

Application Server:
 Application Server CPU – Max., Avg., and 95th percentile. This data will be collected using the
sar system utility.
 Memory footprint: The memory footprint is the peak memory consumed by the application
while running. This data will be collected using the Java Virtual Machine (JVM) verbose garbage
collection logging.
 Time to last byte (TTLB): This is what will be measured in the stress tests, as opposed
to user-perceived response time. Time to last byte measures the time between the request
leaving the client machine and the last byte of the response being sent down from the server.
This time does not take into account the scripting engine that must run in the browser, the
rendering, and other functions that can cause a user to experience poor performance. If the
client-side script is very complex, this number and the user-perceived response time can be
wildly different: a user will not care how fast the response reaches their machine if they
cannot interact with the page for an extended amount of time. This data will be collected
using Silk Performer.
Network:
Network Traffic: Network traffic analysis is one of the most important functions in performance testing.
It can help identify unnecessary transmissions, transmissions which are larger than expected, and those
that can be improved. We need to watch network traffic to identify the bytes over the wire being
transmitted, the response times, and the concurrent connections that are allowed. This data will be
collected using the sar system utility.

Performance/Capability Goals (Expected Results) and Pass/Fail Criteria
The following are performance requirements (success criteria) for the performance tests:
 The average response time (measured by the Time to Last Byte metric) is less than 2.5 seconds
 The worst response time (measured by the Time to Last Byte metric) is less than 30 seconds
 The average CPU utilization of the database server is less than 75%
 The average CPU utilization of the application server is less than 75%
 Each blade server must be capable of handling 500 concurrent users
 The maximum number of acceptable server errors (non-HTTP-200 status codes on client
requests) will be less than 2% of all client requests
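A run's results can be checked against these criteria mechanically. The sketch below hard-codes the thresholds above; the metric values passed in are illustrative, not real results, and the 500-concurrent-user capacity criterion is omitted since it is established by the Capacity Test itself:

```python
# Sketch: evaluate one run against the pass/fail criteria above. The
# thresholds come from this plan; the metric values are illustrative.
def evaluate_run(avg_ttlb_s: float, worst_ttlb_s: float,
                 db_cpu_avg: float, app_cpu_avg: float,
                 total_requests: int, error_requests: int):
    """Return (per-check results, overall pass/fail)."""
    checks = {
        "avg TTLB < 2.5 s":  avg_ttlb_s < 2.5,
        "worst TTLB < 30 s": worst_ttlb_s < 30,
        "DB CPU avg < 75%":  db_cpu_avg < 75,
        "app CPU avg < 75%": app_cpu_avg < 75,
        "error rate < 2%":   error_requests / total_requests < 0.02,
    }
    return checks, all(checks.values())

checks, passed = evaluate_run(1.8, 12.4, 62.0, 58.5, 50_000, 420)
```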

Software and Tools Used
Component             Software Version
Sakai                 Sakai 2.5.0 GA
Servlet Container     Tomcat 5.5.25
Java Virtual Machine  Sun Java Development Kit 1.5.0.14
Database              Oracle 10g (version 10.?.?.?)
Load Test Tool        Silk Performer

Load Descriptions
Each test outlined in the Load/Stress Test Types and Schedules section will run with a ratio of 59
students to 1 instructor. There is no expected difference between users logging in for the first time
and subsequent logins, given how the data (outlined in Content and User Data Preparation) will be
created. The data set these tests start with will appear to be mid-course for all users.
There will be no ramp-up time for any of the Single Function Stress Tests. The ramp-up time for all
other tests should be set to 1 user every 3 seconds; 120 users should therefore be running within 6
minutes. In order to place as much stress on the system as possible with a small number of users, all
users should come from different worksites.
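The ramp-up arithmetic above works out as follows: at one new user every 3 seconds, the Nth user starts at 3 × (N − 1) seconds, so the 120th user is online at 357 seconds, just under the 6-minute mark. As a small sketch:

```python
# Sketch of the ramp-up arithmetic above: one new user every 3 seconds,
# so the Nth user starts at 3*(N-1) seconds and 120 users are running
# by 357 seconds, just under 6 minutes.
RAMP_INTERVAL_S = 3

def start_time_s(user_index: int) -> int:
    """Second at which the Nth user (1-based) starts."""
    return RAMP_INTERVAL_S * (user_index - 1)

def users_running(elapsed_s: int) -> int:
    """Users active after elapsed_s seconds of ramp-up."""
    return elapsed_s // RAMP_INTERVAL_S + 1
```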

Content and User Data Preparation
The Sakai system will need to be preloaded with data before performance testing begins. This
data will be created using the Sakai Shell and Sakai Web Services. Once the data is created, it will be
extracted from the database with a database dump. The following table identifies several types of data
that will need to be preloaded into the Sakai environment.
Type of Data                          Amount of Data
Users                                 93567
Students                              92000
Instructors                           1557
Administrators                        10
Very Large Worksites                  2
Large Worksites                       5
Medium Worksites                      50
Small Worksites                       1500
Students to Instructors Ratio         59 to 1
Students per Very Large Worksite      1000
Students per Large Worksite           500
Students per Medium Worksite          250
Students per Small Worksite           50
Announcements per Worksite            13
Forum Topics per Worksite             1/3 of all forum posts in a worksite
Forum Posts per Worksite              1/4 of students in worksite * 6.5 (spread across topics)
Columns (Assignments) in Gradebook    13
Instructor Resources per Worksite     20
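The derived rows in the table (forum posts and topics per worksite) follow from the worksite sizes. A small sketch; rounding to whole items is an assumption, since the plan does not specify a rounding rule:

```python
# Sketch of the derived rows in the table above: forum posts per worksite
# are 1/4 of the worksite's students times 6.5, and topics are 1/3 of the
# posts. Rounding to whole items is an assumption; the plan does not say.
WORKSITE_STUDENTS = {      # students per worksite, from the table
    "very large": 1000,
    "large": 500,
    "medium": 250,
    "small": 50,
}

def forum_posts(students: int) -> int:
    return round(students / 4 * 6.5)

def forum_topics(students: int) -> int:
    return round(forum_posts(students) / 3)
```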

Load Script Recording
Testware will create new Silk Performer Test Scripts based on the two scenarios outlined in Appendices
1 and 2. The test suite should be set up to accommodate a ratio of 59 students to 1 instructor. Wait
time should be included between each page request, so that the total time for an activity equals (the
number of page requests × 2.5 seconds) plus the total wait time.
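The activity-time formula can be stated directly:

```python
# Sketch of the activity-time formula above: total activity time equals
# the number of page requests times the 2.5 s response budget, plus the
# scripted wait time between requests.
PAGE_BUDGET_S = 2.5

def activity_time_s(page_requests: int, total_wait_s: float) -> float:
    return page_requests * PAGE_BUDGET_S + total_wait_s
```

For example, a 10-page activity with 60 seconds of total wait time budgets to 85 seconds.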

Load Testing Process
This section details the load testing process that will be followed for all performance tests conducted as
described in this test plan.
 Start data collection scripts
 Stop the application
 Remove any temporary files
 Reset the database and file system to known starting points
 Start the application and run a quick sanity test to make sure each application server can
successfully return the login screen markup and successfully process a login request
 Request that the Silk Performer scripts be started
 Once the Silk Performer scripts have completed, stop the data collection scripts
 Acquire any database-specific data being collected
 Collate the data for all the metrics specified into one report
 Make the report available in the Wiley Portal
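The steps above amount to an ordered procedure that stops if any step fails. A minimal sketch; the lambda step bodies are placeholders standing in for the real start/stop and data-collection commands:

```python
# Sketch: the run procedure above as an ordered checklist driver. The
# lambda step bodies are placeholders; a real harness would shell out to
# the actual start/stop and data-collection commands.
def run_load_test(steps):
    """Run (name, fn) steps in order; abort if a step returns False."""
    completed = []
    for name, fn in steps:
        if fn() is False:
            raise RuntimeError(f"step failed: {name}")
        completed.append(name)
    return completed

STEPS = [
    ("start data collection scripts",            lambda: True),
    ("stop the application",                     lambda: True),
    ("remove temporary files",                   lambda: True),
    ("reset database and file system",           lambda: True),
    ("start application and sanity-check login", lambda: True),
    ("run Silk Performer scripts",               lambda: True),
    ("stop data collection scripts",             lambda: True),
    ("acquire database-specific data",           lambda: True),
    ("collate metrics into one report",          lambda: True),
    ("publish report to the Wiley Portal",       lambda: True),
]
```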

Training Needs
Testware will need to be trained on the use of the Sakai environment. The scenarios outlined in
Appendix 1 and Appendix 2 give some idea of which parts of Sakai will be used; however, more
training will probably be required by the test script writers.

System-Under-Test (SUT) Environment
This section specifies the mix of system hardware, software, memory, network protocol, bandwidth,
etc., along with the following variables:
Network access variables
 For example, 56K modem, 128K cable modem, T1, etc.
ISP infrastructure variables
 For example, first tier, second tier, etc.
Client baseline configurations
 Computer variables
 Browser variables
Server baseline configurations
 Computer variables
 System architecture variables and diagrams
Other questions to consider asking:
 What is the definition of “system”?
 How many other users are using the same resources on the system under test (SUT)?
 Are you testing the SUT in its complete, real-world environment (with load balancers, replicated
database, etc.)?
 Is the SUT inside or outside the firewall?
 Is the load coming from inside or outside the firewall?

Test Deliverables
The following test deliverables are expected as part of this performance testing effort.
Test Plan – This Document
Test Scripts – Silk Performer Test Scripts to implement the scenarios outlined in Appendix 1 and
Appendix 2
Test Results Data – The data resulting from the performance test runs
Test Results Final Report – The final report that documents and analyzes the results of the performance
tests that were conducted according to this test plan.

Team Members and Responsibilities
The following table defines the different team member responsibilities.
Responsibility                           Team Member
Test Plan Creation                       Atul
Silk Performer Test Script Creation      TestWare
Validation of Test Script Execution      Atul
Silk Performer Test Script Execution     Prerana
CPU Monitoring and Test Orchestration    Sam
Database Loading and Analysis            Ramu

Risk Assessment and Mitigation
Business
Risk: Unsatisfactory Performance of Sakai
Mitigation: Conduct performance testing of core Sakai at the beginning of the project. If Sakai does not
meet the goals established above, project management will have the most time possible to adjust
project plans.
IT
Risk: Limit on the number of virtual users available with Silk Performer
Mitigation: Test only one blade server per 500 virtual users available with Silk Performer
Risk: All Sakai tools needed for testing at this stage may not be available
Mitigation: Tests will be conducted against the core tools that are in the Sakai 2.5.0 release. Where a
tool that is needed is not yet available, a placeholder tool has been specified in the test scenarios in
Appendices 1 and 2 (e.g., Calendar will be used in place of Student Gateway for this testing).

List of Appendices
Appendix 1 – Student test scenario
Appendix 2 – Instructor test scenario
Appendix 3 – Single function stress test scenarios

Test Plan Approval
Business Approval
__________________________________________________

Paul Harris

_____________
Date

IT Approval
__________________________________________________

Raghav

_____________
Date

Testing Approval
___________________________________________________

Venkat

_____________
Date


More Related Content

What's hot (20)

PDF
Testing check list
Atul Pant
 
PPT
Getting start with Performance Testing
Yogesh Deshmukh
 
PPTX
Performance Testing from Scratch + JMeter intro
Mykola Kovsh
 
PPT
Performance testing with Jmeter
Prashanth Kumar
 
PDF
Performance testing presentation
Belatrix Software
 
PPTX
Introduction to performance testing
Richard Bishop
 
PPTX
Performance testing using Jmeter for apps which needs authentication
Jay Jha
 
PPT
Performance and load testing
sonukalpana
 
DOCX
Loadrunner interview questions and answers
Garuda Trainings
 
PPTX
Performance Testing
Selin Gungor
 
PPT
Performance Testing
sharmaparish
 
PPTX
Load Testing and JMeter Presentation
Neill Lima
 
PPTX
Interpreting Performance Test Results
Eric Proegler
 
PPTX
Introduction to performance testing
Tharinda Liyanage
 
PPTX
Load and performance testing
Qualitest
 
PPT
Load Testing Strategy 101
iradari
 
PDF
Neoload
Kumar Gupta
 
PDF
Performance Testing Using JMeter | Edureka
Edureka!
 
PPTX
QA. Load Testing
Alex Galkin
 
PDF
LoadRunner Performance Testing
Atul Pant
 
Testing check list
Atul Pant
 
Getting start with Performance Testing
Yogesh Deshmukh
 
Performance Testing from Scratch + JMeter intro
Mykola Kovsh
 
Performance testing with Jmeter
Prashanth Kumar
 
Performance testing presentation
Belatrix Software
 
Introduction to performance testing
Richard Bishop
 
Performance testing using Jmeter for apps which needs authentication
Jay Jha
 
Performance and load testing
sonukalpana
 
Loadrunner interview questions and answers
Garuda Trainings
 
Performance Testing
Selin Gungor
 
Performance Testing
sharmaparish
 
Load Testing and JMeter Presentation
Neill Lima
 
Interpreting Performance Test Results
Eric Proegler
 
Introduction to performance testing
Tharinda Liyanage
 
Load and performance testing
Qualitest
 
Load Testing Strategy 101
iradari
 
Neoload
Kumar Gupta
 
Performance Testing Using JMeter | Edureka
Edureka!
 
QA. Load Testing
Alex Galkin
 
LoadRunner Performance Testing
Atul Pant
 

Viewers also liked (19)

PDF
How to start performance testing project
NaveenKumar Namachivayam
 
DOC
Test plan
Akhila Bhaskar
 
PDF
Test plan
Nadia Nahar
 
PPTX
An Introduction to Performance Testing
SWAAM Tech
 
DOCX
Performance testing interview questions and answers
Garuda Trainings
 
PDF
SAP performance testing & engineering courseware v01
Argos
 
PDF
Loadrunner vs Jmeter
Atul Pant
 
PDF
Testing Plan Test Case
guest4c6fd6
 
PDF
Sample test-plan-template
Dell R&D Center, Bangalore
 
PDF
Test plan on iit website
Samsuddoha Sams
 
PDF
sample-test-plan-template.pdf
empite
 
PDF
Testing plan for an ecommerce site
Immortal Technologies
 
PDF
6 Kick-Ass Tips to Improve SAP Performance
Basis Technologies
 
PPT
Qc dept open_sta overview
qc-pyramid
 
PPT
Best Practices In Load And Stress Testing Cmg Seminar[1]
Munirathnam Naidu
 
PDF
Know More About Rational Performance - Snehamoy K
Roopa Nadkarni
 
PDF
Anatomy of Bed Bug Tips to Control from Bed Bugs
Cummings Pest Control
 
PDF
Load Testing SAP Applications with IBM Rational Performance Tester
Bill Duncan
 
PPT
Performance Teting - VU Scripting Using Rational (http://www.geektester.blogs...
raj.kamal13
 
How to start performance testing project
NaveenKumar Namachivayam
 
Test plan
Akhila Bhaskar
 
Test plan
Nadia Nahar
 
An Introduction to Performance Testing
SWAAM Tech
 
Performance testing interview questions and answers
Garuda Trainings
 
SAP performance testing & engineering courseware v01
Argos
 
Loadrunner vs Jmeter
Atul Pant
 
Testing Plan Test Case
guest4c6fd6
 
Sample test-plan-template
Dell R&D Center, Bangalore
 
Test plan on iit website
Samsuddoha Sams
 
sample-test-plan-template.pdf
empite
 
Testing plan for an ecommerce site
Immortal Technologies
 
6 Kick-Ass Tips to Improve SAP Performance
Basis Technologies
 
Qc dept open_sta overview
qc-pyramid
 
Best Practices In Load And Stress Testing Cmg Seminar[1]
Munirathnam Naidu
 
Know More About Rational Performance - Snehamoy K
Roopa Nadkarni
 
Anatomy of Bed Bug Tips to Control from Bed Bugs
Cummings Pest Control
 
Load Testing SAP Applications with IBM Rational Performance Tester
Bill Duncan
 
Performance Teting - VU Scripting Using Rational (http://www.geektester.blogs...
raj.kamal13
 
Ad

Similar to Performance Test Plan - Sample 1 (20)

PPTX
Performance testing
Hassan Mohammed
 
PDF
Chapter 4 - Performance Testing Tasks
Neeraj Kumar Singh
 
PPTX
load_testing.ppt.pptx
PerformanceTesting1
 
PPTX
Performance testing
Chalana Kahandawala
 
PPTX
Performance testing and j meter overview
krishna chaitanya
 
PDF
typesofperformancetesting-130505055525-phpapp02.pdf
SRIRAMKIRAN9
 
PDF
Adding Performance Testing to a Software Development Project
Cris Holdorph
 
PPTX
Performance Testing using LoadRunner
Kumar Gupta
 
PPT
PERFTEST.ppt
hemanthKumar954692
 
PPT
PERFTEST.ppt
MeghanaAkkapalli
 
PPT
08-Performence_Testing Project Explain.ppt
pspc139
 
PDF
performancetestinganoverview-110206071921-phpapp02.pdf
MAshok10
 
PPTX
QSpiders - Introduction to JMeter
Qspiders - Software Testing Training Institute
 
PPTX
performance testing training in hyderabad
aparna3zen
 
PDF
Performance Testing.3zen.pdf
swathi3zen
 
PPTX
performance testing training in hyderabad
madhupriya3zen
 
PPTX
Performance Testing Training in Hyderabad
rajasrichalamala3zen
 
PDF
11.performance testing methodologies and tools
Alexander Decker
 
PDF
Performance testing methodologies and tools
Alexander Decker
 
PPTX
performance testing training in hyderabad
neeraja0480
 
Performance testing
Hassan Mohammed
 
Chapter 4 - Performance Testing Tasks
Neeraj Kumar Singh
 
load_testing.ppt.pptx
PerformanceTesting1
 
Performance testing
Chalana Kahandawala
 
Performance testing and j meter overview
krishna chaitanya
 
typesofperformancetesting-130505055525-phpapp02.pdf
SRIRAMKIRAN9
 
Adding Performance Testing to a Software Development Project
Cris Holdorph
 
Performance Testing using LoadRunner
Kumar Gupta
 
PERFTEST.ppt
hemanthKumar954692
 
PERFTEST.ppt
MeghanaAkkapalli
 
08-Performence_Testing Project Explain.ppt
pspc139
 
performancetestinganoverview-110206071921-phpapp02.pdf
MAshok10
 
QSpiders - Introduction to JMeter
Qspiders - Software Testing Training Institute
 
performance testing training in hyderabad
aparna3zen
 
Performance Testing.3zen.pdf
swathi3zen
 
performance testing training in hyderabad
madhupriya3zen
 
Performance Testing Training in Hyderabad
rajasrichalamala3zen
 
11.performance testing methodologies and tools
Alexander Decker
 
Performance testing methodologies and tools
Alexander Decker
 
performance testing training in hyderabad
neeraja0480
 
Ad

More from Atul Pant (6)

PDF
Sql
Atul Pant
 
PDF
Payment gateway testing
Atul Pant
 
PDF
Cloud computing
Atul Pant
 
PDF
Jmeter Performance Testing
Atul Pant
 
PDF
Unix command
Atul Pant
 
PDF
E commerce Testing
Atul Pant
 
Payment gateway testing
Atul Pant
 
Cloud computing
Atul Pant
 
Jmeter Performance Testing
Atul Pant
 
Unix command
Atul Pant
 
E commerce Testing
Atul Pant
 

Recently uploaded (20)

PPTX
OA presentation.pptx OA presentation.pptx
pateldhruv002338
 
PDF
The Future of Mobile Is Context-Aware—Are You Ready?
iProgrammer Solutions Private Limited
 
PDF
CIFDAQ's Market Wrap : Bears Back in Control?
CIFDAQ
 
PDF
Responsible AI and AI Ethics - By Sylvester Ebhonu
Sylvester Ebhonu
 
PDF
Generative AI vs Predictive AI-The Ultimate Comparison Guide
Lily Clark
 
PDF
Tea4chat - another LLM Project by Kerem Atam
a0m0rajab1
 
PPTX
Applied-Statistics-Mastering-Data-Driven-Decisions.pptx
parmaryashparmaryash
 
PDF
AI Unleashed - Shaping the Future -Starting Today - AIOUG Yatra 2025 - For Co...
Sandesh Rao
 
PPTX
The Future of AI & Machine Learning.pptx
pritsen4700
 
PDF
Make GenAI investments go further with the Dell AI Factory
Principled Technologies
 
PDF
Research-Fundamentals-and-Topic-Development.pdf
ayesha butalia
 
PDF
Market Insight : ETH Dominance Returns
CIFDAQ
 
PDF
OFFOFFBOX™ – A New Era for African Film | Startup Presentation
ambaicciwalkerbrian
 
PDF
Trying to figure out MCP by actually building an app from scratch with open s...
Julien SIMON
 
PPTX
Introduction to Flutter by Ayush Desai.pptx
ayushdesai204
 
PDF
Researching The Best Chat SDK Providers in 2025
Ray Fields
 
PDF
Per Axbom: The spectacular lies of maps
Nexer Digital
 
PDF
The Future of Artificial Intelligence (AI)
Mukul
 
PDF
Peak of Data & AI Encore - Real-Time Insights & Scalable Editing with ArcGIS
Safe Software
 
PDF
TrustArc Webinar - Navigating Data Privacy in LATAM: Laws, Trends, and Compli...
TrustArc
 
OA presentation.pptx OA presentation.pptx
pateldhruv002338
 
The Future of Mobile Is Context-Aware—Are You Ready?
iProgrammer Solutions Private Limited
 
CIFDAQ's Market Wrap : Bears Back in Control?
CIFDAQ
 
Responsible AI and AI Ethics - By Sylvester Ebhonu
Sylvester Ebhonu
 
Generative AI vs Predictive AI-The Ultimate Comparison Guide
Lily Clark
 
Tea4chat - another LLM Project by Kerem Atam
a0m0rajab1
 
Applied-Statistics-Mastering-Data-Driven-Decisions.pptx
parmaryashparmaryash
 
AI Unleashed - Shaping the Future -Starting Today - AIOUG Yatra 2025 - For Co...
Sandesh Rao
 
The Future of AI & Machine Learning.pptx
pritsen4700
 
Make GenAI investments go further with the Dell AI Factory
Principled Technologies
 
Research-Fundamentals-and-Topic-Development.pdf
ayesha butalia
 
Market Insight : ETH Dominance Returns
CIFDAQ
 
OFFOFFBOX™ – A New Era for African Film | Startup Presentation
ambaicciwalkerbrian
 
Trying to figure out MCP by actually building an app from scratch with open s...
Julien SIMON
 
Introduction to Flutter by Ayush Desai.pptx
ayushdesai204
 
Researching The Best Chat SDK Providers in 2025
Ray Fields
 
Per Axbom: The spectacular lies of maps
Nexer Digital
 
The Future of Artificial Intelligence (AI)
Mukul
 
Peak of Data & AI Encore - Real-Time Insights & Scalable Editing with ArcGIS
Safe Software
 
TrustArc Webinar - Navigating Data Privacy in LATAM: Laws, Trends, and Compli...
TrustArc
 

Performance Test Plan - Sample 1

  • 2. Audit Trail: Date April 2, 2013 April 9, 2013 Version 1.0 1.1 Performance Test Plan Name Atul Atul Comment Initial Revision First round of revisions Page 1
  • 3. Table of Contents Reference Documents................................................................................................................................... 3 Objectives and Scope .................................................................................................................................... 3 Exclusions ...................................................................................................................................................... 3 Approach and Execution Strategy................................................................................................................. 3 Load/Stress Test Types and Schedules ......................................................................................................... 3 Test Measurements, Metrics, and Baseline .................................................................................................. 4 Performance/Capability Goals (Expected Results) and Pass/Fail Criteria .................................................... 5 Software and Tools Used .............................................................................................................................. 5 Load Descriptions.......................................................................................................................................... 5 Content and User Data Preparation ............................................................................................................. 6 Load Script Recording ................................................................................................................................... 7 Load Testing Process ..................................................................................................................................... 7 Training Needs .............................................................................................................................................. 
7 System-Under-Test (SUT) Environment ........................................................................................................ 7 Test Deliverables ........................................................................................................................................... 8 Team Members and Responsibilities ............................................................................................................ 8 Risk Assessment and Mitigation ................................................................................................................... 8 List of Appendices ......................................................................................................................................... 9 Test Plan Approval ........................................................................................................................................ 9 Performance Test Plan Page 2
  • 4. Reference Documents Performance Scalability Goals.xls Objectives and Scope The purpose of this document is to outline the environment and performance test plan for benchmarking Sakai 2.5.0 core tools for use in WileyPLUS E5. In general the purposes of this testing are:  Validate the core Sakai framework and certain tools meet the minimum performance standards established for this project.  Performance testing any new Sakai tools that are developed  Performance testing any changes to Sakai Tools that are planned for WileyPLUS E5  Performance testing any BackOffice applications or integrations Exclusions This test plan will not cover any functional or accuracy testing of the software being tested. This test plan will not cover any browser or software compatibility testing. Approach and Execution Strategy Sakai will be tested using an existing Wiley performance test process. This test plan will serve as the basis for Testware to create Silk Performer Test Scripts. These scripts will be run by Leo Begelman using the Silk Performer software. Unicon, Inc. will watch and measure the CPU utilization of the web and database servers used during testing. Unicon, Inc. will analyze and present the performance test results to Wiley at the conclusion of the performance test cycle. Load/Stress Test Types and Schedules The following tests will be run:  Capacity Test – Determines the maximum number of concurrent users that the application server can support under a given configuration while maintaining an acceptable response time and error rate.  Consistent Load Test – Long-running stress test that drives a continuous load on the application server for an extended period of time (at least 6 hours). The main purpose of this type of test is to ensure the application can sustain acceptable levels of performance over an extended period of time without exhibiting degradation, such as might be caused by a memory leak. 
 Single Function Stress Test – A test where 100 users perform the same function with no wait times and no ramp up time. This test will help determine how the application reacts to periods of extreme test in a very narrow area of the code. Performance Test Plan Page 3
  • 5.  Baseline Test – At the conclusion of the Capacity Test and Consistent Load Test a third test will be established with the goal to be a repeatable test that can be performed when any portion of the system is changed. This test will not have the secondary goals the other two tests have, and will simply exist to be a known quantity rather than the breaking point values the other tests are interested in. Test Measurements, Metrics, and Baseline The following metrics will be collected Database Server:  CPU Utilization – Max., Avg., and 95th percentile. This data will be collected using the sar system utility.  SQL query execution time: The time required to execute the top ten SQL queries involved in a performance test run. This data will be collected using Oracle Stats Pack. Application Server:  Application Server CPU – Max., Avg., and 95th percentile. This data will be collected using the sar system utility.  Memory footprint: The memory footprint is the peak memory consumed by the application while running. This data will be collected using the Java Virtual Machine (JVM) verbose garbage collection logging.  Time to last byte (TTLB): This is what will currently be measured in the stress tests, as opposed to user-perceived response time. Time to last byte measures the time between the request leaving the client machine and the last byte of the response being sent down from the server. This time does not take in to account the scripting engine that must run in the browser, the rendering, and other functions that can cause a user to experience poor performance. If the client-side script is very complex this number and the user perceived response time can be wildly different. A user will not care how fast the response reaches their machine (about the user perceived response time) if they cannot interact with the page for an extended amount of time. This data will be collected using Silk Performer. 
Network:
- Network traffic – Network traffic analysis is one of the most important functions in performance testing. It can help identify unnecessary transmissions, transmissions that are larger than expected, and transmissions that can be improved. We need to watch network traffic to identify the bytes transmitted over the wire, the response times, and the number of concurrent connections allowed. This data will be collected using the sar system utility.
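One part of the traffic analysis described above is spotting responses that are larger than expected. A minimal sketch of that check, where the URLs, sizes, and the 1 MB threshold are all illustrative assumptions rather than values from this plan:

```python
# Hypothetical per-response records: (url, bytes_over_the_wire, seconds).
responses = [
    ("/portal/login", 48_200, 0.41),
    ("/portal/site/home", 310_500, 1.92),
    ("/portal/tool/gradebook", 1_250_000, 3.80),
]

def flag_heavy(responses, byte_limit=1_000_000):
    """Return URLs whose responses are larger than expected -- candidates
    for compression or trimming during network-traffic analysis."""
    return [url for url, size, _ in responses if size > byte_limit]

print(flag_heavy(responses))  # ['/portal/tool/gradebook']
```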
Performance/Capability Goals (Expected Results) and Pass/Fail Criteria

The following are performance requirements (success criteria) for the performance tests:
- The average response time (measured by the time-to-last-byte metric) is less than 2.5 seconds.
- The worst response time (measured by the time-to-last-byte metric) is less than 30 seconds.
- The average CPU utilization of the database server is less than 75%.
- The average CPU utilization of the application server is less than 75%.
- Each blade server must be capable of handling 500 concurrent users.
- Server errors (non-HTTP-200 status codes on client requests) must account for less than 2% of all client requests.

Software and Tools Used

Component              Software                    Version
Sakai                  Sakai                       2.5.0 GA
Servlet Container      Tomcat                      5.5.25
Java Virtual Machine   Sun Java Development Kit    1.5.0.14
Database               Oracle                      10g (version 10.?.?.?)
Load Test Tool         Silk Performer

Load Descriptions

Each test outlined in section 5 will run with a ratio of 59 students to 1 instructor. There is no expected difference between users logging in for the first time and subsequent logins, given how the data (outlined in section 10) will be created. The data set these tests start with will appear to be in mid-course for all users.

There will be no ramp-up time for any of the Single Function Stress Tests. The ramp-up time for all other tests should be set to 1 user every 3 seconds; 120 users should therefore be running within 6 minutes. In order to place as much stress on the system as possible with a small number of users, all users should come from different worksites.
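The pass/fail criteria above can be evaluated mechanically once a run's metrics have been collated. A minimal sketch, where the metric names and the sample run values are illustrative (only the limits come from the criteria in this plan):

```python
# Limits from the pass/fail criteria; a run fails a criterion when its
# value reaches or exceeds the limit.
CRITERIA = {
    "avg_ttlb_s": 2.5,        # average response time < 2.5 s
    "worst_ttlb_s": 30.0,     # worst response time < 30 s
    "db_cpu_avg_pct": 75.0,   # database server CPU < 75%
    "app_cpu_avg_pct": 75.0,  # application server CPU < 75%
    "error_rate_pct": 2.0,    # non-HTTP-200 responses < 2% of requests
}

def pass_fail(run):
    """Return the list of criteria the run violates (empty list = pass)."""
    return [name for name, limit in CRITERIA.items() if run[name] >= limit]

# Hypothetical collated results for one test run.
run = {
    "avg_ttlb_s": 2.1,
    "worst_ttlb_s": 18.4,
    "db_cpu_avg_pct": 68.0,
    "app_cpu_avg_pct": 71.5,
    "error_rate_pct": 0.4,
}
print("PASS" if not pass_fail(run) else pass_fail(run))
```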
Content and User Data Preparation

The Sakai system will need to be preloaded with data before performance testing begins. This data will be created using the Sakai Shell and Sakai Web Services. Once the data is created, it will be extracted from the database with a database dump. The following table identifies the types of data that will need to be preloaded into the Sakai environment.

Type of Data                                      Amount of Data
Users                                             93567
Students                                          92000
Instructors                                       1557
Administrators                                    10
Very Large Worksites                              2
Large Worksites                                   5
Medium Worksites                                  50
Small Worksites                                   1500
Students to Instructors Ratio                     59 to 1
Students per Very Large Worksite                  1000
Students per Large Worksite                       500
Students per Medium Worksite                      250
Students per Small Worksite                       50
Announcements per Worksite                        13
Forum Topics per Worksite                         1/3 of all forum posts in a worksite
Forum Posts per Worksite (spread across topics)   1/4 of students in worksite * 6.5
Columns (Assignments) in Gradebook                13
Instructor Resources per Worksite                 20
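The figures in the table above are internally consistent and can be sanity-checked before loading: the worksite counts sum to the instructor count (which suggests one instructor per worksite, though the plan does not state that explicitly), and the student-to-instructor ratio works out to roughly 59:1. A quick check, with the derived forum-post formula from the table:

```python
# Worksite mix from the data-preparation table above.
WORKSITES = {
    "very_large": {"count": 2,    "students": 1000},
    "large":      {"count": 5,    "students": 500},
    "medium":     {"count": 50,   "students": 250},
    "small":      {"count": 1500, "students": 50},
}
STUDENTS, INSTRUCTORS = 92000, 1557

def forum_posts(students_in_worksite):
    """Forum posts per worksite: 1/4 of the worksite's students * 6.5."""
    return round(students_in_worksite / 4 * 6.5)

total_worksites = sum(w["count"] for w in WORKSITES.values())
print(total_worksites)                # 1557 -- equal to the instructor count
print(round(STUDENTS / INSTRUCTORS))  # 59 -- matches the 59:1 ratio
print(forum_posts(1000))              # 1625 posts in a Very Large Worksite
```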
Load Script Recording

Testware will create new Silk Performer test scripts based on the two scenarios outlined in Appendices 1 and 2. The test suite should be set up to accommodate a ratio of 59 students to 1 instructor. Wait time should be included between each page request, so that the total time for an activity is equal to the number of page requests * 2.5 seconds + the total wait time.

Load Testing Process

This section details the load testing process that will be followed for all performance tests conducted as described in this test plan.
- Start the data collection scripts.
- Stop the application.
- Remove any temporary files.
- Reset the database and file system to known starting points.
- Start the application and run a quick sanity test to make sure each application server can successfully return the login screen markup and can successfully process a login request.
- Request that the Silk Performer scripts be started.
- Once the Silk Performer scripts have completed, stop the data collection scripts.
- Acquire any database-specific data being collected.
- Collate the data for all the metrics specified into one report. Make the report available in the Wiley Portal.

Training Needs

Testware will need to be trained on the use of the Sakai environment. The scenarios outlined in Appendix 1 and Appendix 2 give some idea of which parts of Sakai will be used; however, more training will probably be required by the test script writers.

System-Under-Test (SUT) Environment

Specify mixes of system hardware, software, memory, network protocol, bandwidth, etc.
- Network access variables – For example, 56K modem, 128K cable modem, T1, etc.
- ISP infrastructure variables – For example, first tier, second tier, etc.
- Client baseline configurations – Computer variables; browser variables.
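The load testing process above is a fixed sequence that must run in order and stop on the first failure. A minimal orchestration sketch; the script names are hypothetical placeholders (the plan lists the steps but not the commands), and the dry run below substitutes a stub runner so nothing is actually executed:

```python
import subprocess

# Hypothetical script names, one per step of the load testing process.
STEPS = [
    ["start-collectors.sh"],      # start data collection scripts
    ["stop-app.sh"],              # stop the application
    ["clean-temp.sh"],            # remove any temporary files
    ["reset-db-and-files.sh"],    # reset DB/file system to known start
    ["start-app-and-sanity.sh"],  # start the app; check login works
    ["run-silk-scripts.sh"],      # request the Silk Performer scripts
    ["stop-collectors.sh"],       # stop data collection scripts
    ["collect-db-metrics.sh"],    # acquire database-specific data
    ["collate-report.sh"],        # collate metrics into one report
]

def run_process(steps, runner=subprocess.run):
    """Execute each step in order, stopping at the first failure."""
    for cmd in steps:
        if runner(cmd).returncode != 0:
            raise RuntimeError(f"step failed: {cmd[0]}")

# Dry run with a stub runner that records each command without executing it.
class _Ok:
    returncode = 0

executed = []
run_process(STEPS, runner=lambda cmd: (executed.append(cmd[0]), _Ok())[1])
print(executed)
```

Stopping at the first failure matters here because later steps (for example, collating the report) are meaningless if the database reset or the sanity test did not succeed.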
- Server baseline configurations – Computer variables; system architecture variables and diagrams.

Other questions to consider asking:
- What is the definition of "system"?
- How many other users are using the same resources on the system under test (SUT)?
- Are you testing the SUT in its complete, real-world environment (with load balancers, replicated database, etc.)?
- Is the SUT inside or outside the firewall?
- Is the load coming from inside or outside the firewall?

Test Deliverables

The following test deliverables are expected as part of this performance testing effort.
- Test Plan – This document.
- Test Scripts – Silk Performer test scripts implementing the scenarios outlined in Appendix 1 and Appendix 2.
- Test Results Data – The data resulting from the performance test runs.
- Test Results Final Report – The final report that documents and analyzes the results of the performance tests conducted according to this test plan.

Team Members and Responsibilities

The following table defines the different team member responsibilities.

Responsibility                          Team Member
Test Plan Creation                      Atul
Silk Performer Test Script Creation     TestWare
Validation of Test Script Execution     Atul
Silk Performer Test Script Execution    Prerana
CPU Monitoring and Test Orchestration   Sam
Database Loading and Analysis           Ramu

Risk Assessment and Mitigation
Business Risk: Unsatisfactory performance of Sakai
Mitigation: Conduct performance testing of core Sakai at the beginning of the project. If Sakai does not meet the goals established above, project management will have the most time possible to adjust project plans.

IT Risk: Limit on the number of virtual users available with Silk Performer
Mitigation: Test only one blade server per 500 virtual users available with Silk Performer.

Risk: All Sakai tools needed for testing at this stage may not be available
Mitigation: Tests will be conducted against the core tools that are in the Sakai 2.5.0 release. Where a tool that is needed is not yet available, a placeholder tool has been specified in the test scenarios in Appendices 1 and 2 (e.g., Calendar will be used in place of Student Gateway for this testing).

List of Appendices

Appendix 1 – Student test scenario
Appendix 2 – Instructor test scenario
Appendix 3 – Single function stress test scenarios

Test Plan Approval

Business Approval
__________________________________________________    _____________
Paul Harris                                           Date

IT Approval
__________________________________________________    _____________
Raghav                                                Date

Testing Approval
__________________________________________________    _____________
Venkat                                                Date