Guidewire ClaimCenter Performance Test Plan
HSB
Revision History
Date | Version | Description | Author
Aug 02, 2013 | 1.0 | Initial Version | Venkateswararao Dodda
APPROVAL SIGN OFF

The below-referenced deliverable has been reviewed and accepted by:
Role | Name(s) | Date
Business Analyst | |
IS Lead Project Manager | |
Integration Development Lead | |
Database Lead | |
QA Lead | |
Performance Engineer | |
If not approving, note your issues and explain your position below:
Index
GW Claim Center Performance Testing
APPROVAL SIGN OFF
Index
1. Introduction
   1.1 Description
   1.2 Purpose
   1.3 Scope
2. Performance Testing Approach
3. Workload Distribution
   3.1 Workload Distribution for Guidewire ClaimCenter
   3.2 Details of Application under Test
      3.2.1 Guidewire ClaimCenter
4. Module Details
6. Schedule Tests
7. Deliverables
8. Project Closeout
9. RISKS
10. Assumptions
11. Milestones
12. Stakeholders
13. System Architecture
   13.1 Test Environment
   13.2 System Architecture (Production)
1. Introduction
1.1 Description
The goal of this project is to implement the Guidewire ClaimCenter system and integrate it with existing HSB IT systems, so that users of the existing claim system can be migrated seamlessly onto the new system with continuity of business processes. Performance testing will be carried out as a sanity check to verify that the application performs as required under the defined user load.
1.2 Purpose
Performance tests will be conducted to validate the performance of the Guidewire ClaimCenter application and to verify that the performance requirements are met.
1.3 Scope
In Scope
The Guidewire ClaimCenter application will be tested in the Production and Stage environments, limited to the following:
- Application study
- Requirements gathering & analysis (for the performance tests to be carried out)
- Preparation of the performance test design with all scenarios & the workload model
- Aid in test data preparation (to accelerate the process)
- Stress & load test execution
- High-level analysis of results obtained after the test execution
- Test scripting, considering 10 test scripts for the current scope (10 for each of the Production & Stage environments)
- Each test script will have a minimum of 15 steps and a maximum of 50 steps (a step is a mouse or keyboard event)
- Each test script will have a minimum of 1 page & a maximum of 13 pages
- Each test script will have a maximum of 25 transactions & 13 parameters
- Each test script will have a maximum of 7 correlations
- Each test script will have a maximum of 3 unique test-data parameters
Out of Scope
The following requirements are out of scope for the L&T Infotech team:
- Hardware capacity management
- Network capacity management
- Hardware maintenance
- Network maintenance
- Baseline server/system parameters
- Any other activity not mentioned in the scope of work
- Performance testing of web services or any other third-party interfaces
- File/image comparison
- Performance testing of any other application (dependent, independent, interfaced, or otherwise) other than the one mentioned in the scope
- Performance testing of pages involving:
  - RIA (Rich Internet Application) objects
  - Java/Flash objects
2. Performance Testing Approach

The table below lists the planned activities with the owning and supporting parties:

Activity | Owner | Support
Run a dummy script to check the connectivity between controller & LoadRunner | L&T Infotech | HSB
Access to environment (for scripting purposes) | HSB | L&T Infotech
Protocol finalization | L&T Infotech | HSB
Test step documentation (max 10) | L&T Infotech | HSB
Test script preparation (max 10) | L&T Infotech | HSB
Validate test scripts | L&T Infotech | HSB
Test data preparation, if required | HSB | L&T Infotech
Smoke test execution preparation | L&T Infotech | HSB
Smoke test & address modifications in scripts, data, or application, if required | L&T Infotech | HSB
Performance test execution preparation | L&T Infotech | HSB
Stress Test Execution #1 (to find the breaking point of the system; considering max 500 VUsers) | L&T Infotech | HSB
Stress Test Execution #2 (for averaging purposes) | L&T Infotech | HSB
Re-Stress Test Execution #1 (to find the breaking point of the system; considering max 500 VUsers; post fixes) | L&T Infotech | HSB
Re-Stress Test Execution #2 (for averaging purposes; post fixes) | L&T Infotech | HSB
Load Test Execution #1 (to find the performance of the application at a user load of 230) | L&T Infotech | HSB
Load Test Execution #2 (for averaging purposes) | L&T Infotech | HSB
Fixing issues in environment | HSB | L&T Infotech
Load Re-Test Execution #1 (to find the performance of the application at a user load of 230; post fixes) | L&T Infotech | HSB
Load Re-Test Execution #2 (for averaging purposes; post fixes) | L&T Infotech | HSB
Fixing issues in environment | HSB | L&T Infotech
Load Re-Test Execution #1 (to find the performance of the application at a user load of 230; post more fixes) | L&T Infotech | HSB
Load Re-Test Execution #2 (for averaging purposes; post more fixes) | L&T Infotech | HSB
The following table summarizes the overall interaction with multiple stakeholders in order to execute this engagement smoothly and effectively. Legend: R - Responsible, A - Accountable, C - Consulted, I - Informed, NA - Not Applicable.
Sr. No | Activity | Sub Activity | Development Team | DBA (HSB) | Architects (HSB)

1 | | Responsible | Consulted | Responsible | Accountable | NA | NA | NA
2 | Application Study
3 | Test Tool Set up
4 | Finalization of Test Cases
5 | Test Step Documentation
6 | Test Scripting & Standardization
7 | Test Data Preparation
8 | Test Execution Setup
9 | Dry run
10 | Test Execution

RACI assignments per stakeholder, in row order:
- Development Team: Responsible, NA, Responsible, Responsible, NA, Informed, Informed, Informed, Informed, Responsible, Informed, Responsible, Informed
- DBA (HSB): Responsible, Accountable, Informed, Accountable, Accountable, Accountable, Accountable, Accountable, Accountable, Accountable, Accountable, Accountable, Responsible
- Architects (HSB): Accountable, Responsible, Accountable, Informed, Informed, Informed, Informed, Informed, Informed, Informed, Informed, Informed, Accountable
- Other stakeholders: Consulted, NA, Consulted, Consulted, Consulted, Consulted, Informed, Consulted, Consulted, Responsible, Consulted, Responsible, Informed
The following table summarizes the roles & responsibilities of each stakeholder:

Engagement Manager (L&T Infotech):
- Serves as an escalation point for project delivery
- Tracks project delivery & status
- Instrumental in ramp-up/ramp-down of resources, as required
- SPOC from L&T Infotech for additional project requirements and for commercial queries
- Participates in governance meetings
- Overall delivery management of projects
- Ensures smooth operations
- Resource management & query/issue management

Performance Tester (L&T Infotech):
- Understand & finalize the performance requirements with the help of the SPOC
- Prepare the execution approach for performance testing
- Coordinate with the Functional Team for test case finalization
- Coordinate with the DBA Team for monitoring counters, report analysis, and database backup/restore
- Set up load generators & monitoring tools with the help of the Infrastructure Team
- Prepare test scripts & execution plans and perform test execution & reporting along with Performance Tester(s)
- Document test steps for all the test cases identified, with the help of the Functional Team
- Prepare test scripts for performance testing and the test execution plan
- Schedule performance test execution, perform test execution & prepare the report
- Perform re-executions as per requirements & prepare the final performance report, if planned

Business / IT Team (HSB):
- Set up monitoring for the servers
- Identify major pain areas in the application (performance)
- Contribute towards analyzing the performance test results
- Provide recommendations on the application issues identified during performance testing
- Facilitate database backup/restore during the performance test execution cycles
- Monitor the application logs during performance test execution
- Tune the application for potential performance issues identified
- Finalize the requirements, the test cases to be considered, and the approach to be followed
- Provide final sign-offs on the deliverables

Networking/Infrastructure Team (HSB):
- Facilitate the project execution
- Help set up the Test Environment, Load Generators & Development Machines
- Contribute towards analyzing the performance test results
- Provide recommendations on the infrastructure/network issues identified during performance testing
- Facilitate ramp-up/down of infrastructure for performance test purposes

Functional Team (HSB):
- Prepare test case flows
- Help the testing team with the navigation of test cases
- Prepare test data for test script preparation & execution
3. Workload Distribution
A workload distribution represents the functions performed by a user community on a system, based on the percentage of users performing each function over a given period of time.
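To make the definition concrete, a workload distribution can be applied to a total user load to obtain per-function virtual-user counts. The function names and percentages below are illustrative only, not taken from this plan's test cases:

```python
# Sketch: convert a workload distribution (percent of users per function)
# into virtual-user counts for a given total user load.
# The function mix below is hypothetical.

def distribute_users(total_users, workload):
    """workload maps function name -> percentage of users (must sum to 100)."""
    assert sum(workload.values()) == 100, "percentages must sum to 100"
    return {name: total_users * pct // 100 for name, pct in workload.items()}

mix = {"CreateClaim": 60, "PolicySearch": 20, "SearchExistingClaims": 20}
print(distribute_users(100, mix))  # {'CreateClaim': 60, 'PolicySearch': 20, 'SearchExistingClaims': 20}
```

In practice the same split is entered as user-group sizes in the load tool's scenario configuration.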
- App Server (Prod)
- CM Server (Prod)
- Batch Server (Prod)
- DB Server (Prod)
- DB Server (Prod)
4. Module Details
4.1 Guidewire ClaimCenter
GWClaimCenter_PerformanceTest_TestCases_Steps.xlsx
6. Schedule Tests
Schedule tests include those activities that are mandatory to validate the performance of the system. These tests need to be executed even if no performance issues are detected and no tuning is required. Only two activities are conducted in this aspect:
- Stress Test
- Load Test

Following are the various scenarios that can be considered for performance testing:

Scenario 1 - Guidewire Claim Creation, Low Load: This scenario comprises test scripts pertaining to CRT and Phoenix policies, in which ClaimCenter creates claims for both Assumed and Direct policies.

Test Scenario Number: Scenario 1 (20 users)
Test Scenario Name: Claim Creation - Low Load

Script Name | Claims/Executions per Hour
TC02_IntakeQueueNewClaimForCRTAssumedPolicy | 40
TC03_IntakeQueueNewClaimForDirectPolicy | 30
TC04_IntakeQueueNewClaimForPhoenixAssumedPolicy | 30
Total | 100

Scenario 2 - Guidewire Claim Creation, Medium Load: This scenario comprises test scripts pertaining to CRT and Phoenix policies, in which ClaimCenter creates claims for both Assumed and Direct policies.

Test Scenario Number: Scenario 2 (30 users)
Test Scenario Name: Claim Creation - Medium Load

Script Name | Claims/Executions per Hour
TC02_IntakeQueueNewClaimForCRTAssumedPolicy | 50
TC03_IntakeQueueNewClaimForDirectPolicy | 50
TC04_IntakeQueueNewClaimForPhoenixAssumedPolicy | 50
Total | 150
Scenario 3 - Guidewire Claim Creation, High Load: This scenario comprises test scripts pertaining to CRT and Phoenix policies, in which ClaimCenter creates claims for both Assumed and Direct policies.

Test Scenario Number: Scenario 3 (40 users)
Test Scenario Name: Claim Creation - High Load

Script Name | Claims/Executions per Hour
TC02_IntakeQueueNewClaimForCRTAssumedPolicy | 80
TC03_IntakeQueueNewClaimForDirectPolicy | 60
TC04_IntakeQueueNewClaimForPhoenixAssumedPolicy | 60
Total | 200
Scenario 4 - Guidewire ClaimCenter All Scripts, Low Load: This scenario comprises test scripts pertaining to CRT and Phoenix policies, in which ClaimCenter creates claims for both Assumed and Direct policies, along with policy search, attaching documents to existing claims, and searching for existing claims.

Test Scenario Number: Scenario 4 (100 users)
Test Scenario Name: Claim Center - All Functions Low Load

Script Name | Claims/Executions per Hour
TC01_PolicySearch | 50
TC02_IntakeQueueNewClaimForCRTAssumedPolicy | 100
TC03_IntakeQueueNewClaimForDirectPolicy | 100
TC04_IntakeQueueNewClaimForPhoenixAssumedPolicy | 100
TC05_AttachDocumentsForExistingClaim | 100
TC06_SearchExistingClaims | 50
Total | 500
Scenario 5 - Guidewire ClaimCenter All Scripts, Medium Load: This scenario comprises test scripts pertaining to CRT and Phoenix policies, in which ClaimCenter creates claims for both Assumed and Direct policies, along with policy search, attaching documents to existing claims, and searching for existing claims.

Test Scenario Number: Scenario 5 (150 users)
Test Scenario Name: Claim Center - All Functions Medium Load

Script Name | Claims/Executions per Hour
TC01_PolicySearch | 100
TC02_IntakeQueueNewClaimForCRTAssumedPolicy | 150
TC03_IntakeQueueNewClaimForDirectPolicy | 150
TC04_IntakeQueueNewClaimForPhoenixAssumedPolicy | 150
TC05_AttachDocumentsForExistingClaim | 100
TC06_SearchExistingClaims | 100
Total | 750
Scenario 6 - Guidewire ClaimCenter All Scripts, High Load: This scenario comprises test scripts pertaining to CRT and Phoenix policies, in which ClaimCenter creates claims for both Assumed and Direct policies, along with policy search, attaching documents to existing claims, and searching for existing claims.

Test Scenario Number: Scenario 6 (200 users)
Test Scenario Name: Claim Center - All Functions High Load

Script Name | Claims/Executions per Hour
TC01_PolicySearch | 125
TC02_IntakeQueueNewClaimForCRTAssumedPolicy | 200
TC03_IntakeQueueNewClaimForDirectPolicy | 200
TC04_IntakeQueueNewClaimForPhoenixAssumedPolicy | 200
TC05_AttachDocumentsForExistingClaim | 150
Scenario 7 - Guidewire ClaimCenter Peak Load: This scenario comprises test scripts pertaining to CRT and Phoenix policies, in which ClaimCenter creates claims for both Assumed and Direct policies, along with policy search, attaching documents to existing claims, and searching for existing claims.

Test Scenario Number: Scenario 7 (300 users)
Test Scenario Name: Claim Center - All Functions Peak Load

Script Name | Claims/Executions per Hour
TC01_PolicySearch | 200
TC02_IntakeQueueNewClaimForCRTAssumedPolicy | 300
TC03_IntakeQueueNewClaimForDirectPolicy | 300
TC04_IntakeQueueNewClaimForPhoenixAssumedPolicy | 300
TC05_AttachDocumentsForExistingClaim | 200
TC06_SearchExistingClaims | 200
Total | 1500
Scenario 8 - Guidewire ClaimCenter Stress Test: This scenario comprises test scripts pertaining to CRT and Phoenix policies, in which ClaimCenter creates claims for both Assumed and Direct policies, along with policy search, attaching documents to existing claims, and searching for existing claims.

Test Scenario Number: Scenario 8 (500 users)
Test Scenario Name: Claim Center - All Functions Stress Test

Script Name | Claims/Executions per Hour
TC01_PolicySearch | 325
TC02_IntakeQueueNewClaimForCRTAssumedPolicy | 500
TC03_IntakeQueueNewClaimForDirectPolicy | 500
TC04_IntakeQueueNewClaimForPhoenixAssumedPolicy | 500
TC05_AttachDocumentsForExistingClaim | 350
TC06_SearchExistingClaims | 325
Total | 2500
Scenario 9 - Guidewire ClaimCenter Soak Test: This scenario comprises test scripts pertaining to CRT and Phoenix policies, in which ClaimCenter creates claims for both Assumed and Direct policies, along with policy search and searching for existing claims. This test will be executed with minimum load for 4 hours.

Test Scenario Number: Scenario 9 (100 users)
Test Scenario Name: Claim Center - All Functions Soak Test

Script Name | Claims/Executions per Hour
TC01_PolicySearch | 100
TC02_IntakeQueueNewClaimForCRTAssumedPolicy | 100
TC03_IntakeQueueNewClaimForDirectPolicy | 100
TC04_IntakeQueueNewClaimForPhoenixAssumedPolicy | 100
Scenario 10 - Guidewire ClaimCenter Longevity Test: This scenario comprises test scripts pertaining to CRT and Phoenix policies, in which ClaimCenter creates claims for both Assumed and Direct policies, along with policy search and searching for existing claims. This test will be executed with minimum load for 12 hours.

Test Scenario Number: Scenario 10 (50 users)
Test Scenario Name: Claim Center - All Functions Longevity Test

Script Name | Claims/Executions per Hour
TC01_PolicySearch | 50
TC02_IntakeQueueNewClaimForCRTAssumedPolicy | 50
TC03_IntakeQueueNewClaimForDirectPolicy | 50
TC04_IntakeQueueNewClaimForPhoenixAssumedPolicy | 50
TC06_SearchExistingClaims | 50
Total | 250
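Each scenario above pairs a virtual-user count with a per-script hourly execution target, which together determine the pacing (time between iteration starts) each user must observe. The arithmetic can be sketched as follows; note the per-script user split is an assumption, since the plan does not state how users are divided among scripts:

```python
# Sketch: derive a pacing interval from a scenario's hourly target.
# users_on_script: virtual users assigned to one script (an assumption --
# the plan does not state per-script user splits).
# executions_per_hour: target taken from the scenario table.

def pacing_seconds(users_on_script, executions_per_hour):
    """Seconds each user should wait between iteration starts."""
    return users_on_script * 3600 / executions_per_hour

# Example with Scenario 1 numbers: if, say, 8 of the 20 users ran
# TC02_IntakeQueueNewClaimForCRTAssumedPolicy (target: 40/hour),
# each would start a new iteration every 720 seconds.
print(pacing_seconds(8, 40))  # 720.0
```

This value roughly corresponds to a fixed-interval pacing setting in the load tool's runtime configuration.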
7. Deliverables
a. From L&T Infotech

L&T Infotech would deliver the following during the course of the project:
- Performance test plan and strategy document
- Validated test scripts
- Test scenarios
- Execution Summary Report containing the following information:
  - Average transaction response time
  - Throughput details
  - CPU & RAM utilization
  - Bottleneck details
- Closure Report
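The response-time and throughput figures in the Execution Summary Report can be derived from raw per-transaction samples. The sketch below is illustrative; the data shapes and sample values are assumptions, not the report's actual format:

```python
# Sketch: compute average transaction response time and overall throughput
# from raw per-transaction samples. The sample data is illustrative.

def summarize(samples, duration_seconds):
    """samples: list of (transaction_name, response_time_seconds) tuples."""
    by_txn = {}
    for name, rt in samples:
        by_txn.setdefault(name, []).append(rt)
    # Average response time per transaction name
    avg_response = {name: sum(v) / len(v) for name, v in by_txn.items()}
    # Throughput as completed transactions per second over the run
    throughput_per_sec = len(samples) / duration_seconds
    return avg_response, throughput_per_sec

samples = [("SubmitClaim", 2.0), ("SubmitClaim", 4.0), ("PolicySearch", 1.0)]
avg, tps = summarize(samples, duration_seconds=60)
print(avg, tps)  # SubmitClaim averages 3.0 s; 0.05 transactions/s
```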
b. From HSB

L&T Infotech expects the following deliverables from HSB during the course of the project:
- Finalized list of test cases to be considered for performance testing
8. Project Closeout
The Project Closeout aspect has only one activity, but that activity is important and independent enough to deserve an aspect of its own: Document Results. The results document would include:
- Spreadsheets of recorded times for each user load
- A spreadsheet of average times grouped by user load
- Test execution reports for each test executed
9. RISKS
L&T Infotech has a well-defined risk management process to identify potential risks in advance and focus on preventing them, in order to ensure proper execution and completion of the project. A Risk Management process is followed wherein all risks for a project are identified and maintained in a Risk Monitoring Log (RML). For each risk, the probability of occurrence, the impact, and the action plan for managing or mitigating the risk are identified. The risks listed in the RML are assessed and monitored during the weekly project status review meetings. Any new risk identified during the execution of the project is added to the RML and monitored closely.

L&T Infotech has identified the following risk factors along with possible mitigation plans. L&T Infotech, together with HSB, will chalk out the risk mitigation plan at the start of the project. These risks will be revisited and monitored periodically to ensure smooth project delivery.

S. No | Risk | Impact | Mitigation/Contingencies
1 | Clarity and stability of requirements (changes in requirements, scope creep, etc. during project execution) | High | Any changes to the scope of the project would be addressed through the standard Change Control procedures followed by L&T Infotech
2 | Timely approval/sign-off for key documents | | Careful planning, effective communication, and an efficient escalation process would mitigate this risk
3 | Non-availability of required test data | | SPOC to facilitate test data generation; the test execution plan would need to be reworked
4 | Delay in resolution of queries and clarifications regarding the project | High | Queries/issues to be resolved at the earliest; an escalation process should also be in place to escalate unresolved issues to a higher level in the project management chain
5 | Test environment instability due to activities like upgrades or code fixes | High | SPOC to ensure that only a functionally stable build is used for the performance testing activity; any instability would require rework of the test execution plan
6 | Test environment is not the same as the production environment | Very High | SPOC to ensure that the performance test environment is a replica of production or a mathematically scaled-down version of the production environment
7 | Non-dedicated environment for performance testing | Medium | Have a dedicated performance testing environment where the application can be accessed only for performance testing activities
8 | Unavailability of the DBA Team, Development Team, or Infrastructure/Networking Team for performance isolation & fixes | Very High | Ensure the support/availability of the DBA Team, Development Team, or Infrastructure/Networking Team
10. Assumptions
- Requirements are baselined or approved prior to testing.
- The environment will be dedicated to performance testing during test execution and has been properly configured to support performance testing.
- Support teams (application development, database, deployment, and system administration) will be available during test runs to troubleshoot potential problems that may arise.
- Performance fixes are delivered according to defect management guidelines or an agreed-upon timeline.
- HP LoadRunner is recommended for performance testing.
- Any issue related to the application/network would have to be solved by the respective application/network owners. L&T Infotech would only be responsible for LoadRunner-related issues during the tenure of the engagement.
- The resource loading is based on our current knowledge of the scope of work and understanding of the requirements. Resource ramp-up or a change in scope (addition/removal) will be done after mutual agreement between L&T Infotech and HSB.
- The Performance Testing Lead & Performance Testers would travel to the Hartford, CT office of HSB and work from the onshore location for the duration of the project.
- Any delay in availability of HSB resources, sign-offs, access to environments, or onboarding activities would potentially affect the project timelines and cost.
- Deliverables are to be signed off within 2 working days.
- The Stress Test would be executed with a maximum of 500 concurrent users, with failing criteria as more than 10% errors observed at the client side.
- The Load Test would be executed with a concurrent user load of 250, with failing criteria as more than 5 seconds of response time for a transaction (a transaction is a submit event) & CPU/RAM utilization of less than 50%.
- Any delay in availability of the machines at the specified locations (Cincinnati, Ohio; Hartford, CT) would necessitate a revision of the project schedule.
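The stress and load test failing criteria stated above can be expressed as a simple post-run check. The helper below is illustrative only, with assumed input shapes; the thresholds (10% client-side errors, 5-second transaction response time) come from the assumptions above:

```python
# Sketch: evaluate the stress/load pass-fail criteria from the assumptions.
# Thresholds come from the plan; the input shapes are assumptions.
# (The plan also tracks CPU/RAM utilization, not modeled here.)

ERROR_RATE_LIMIT = 0.10      # stress test: fail above 10% client-side errors
RESPONSE_TIME_LIMIT = 5.0    # load test: fail if a transaction exceeds 5 s

def stress_test_passed(total_iterations, failed_iterations):
    """Stress test passes when the client-side error rate stays at or below 10%."""
    return failed_iterations / total_iterations <= ERROR_RATE_LIMIT

def load_test_passed(response_times):
    """response_times: transaction name -> response time in seconds."""
    return all(t <= RESPONSE_TIME_LIMIT for t in response_times.values())

print(stress_test_passed(500, 40))                             # True (8% errors)
print(load_test_passed({"SubmitClaim": 4.2, "Search": 1.1}))   # True
```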
11. Milestones
S. No. | Activities | Start Date | End Date | Owned By
1 | Completion of Performance Test Plan | 07/29/2013 | 08/09/2013 | Performance Test Team
3 | Conduct Shake-out Test | 07/29/2013 | 08/16/2013 | Performance Test Team
4 | Completion of Performance Test Data | 07/29/2013 | 08/16/2013 | TBD
5 | Completion of Performance Test Scripts and Scenarios | 07/30/2013 | 08/16/2013 | Performance Test Team
6 | Conduct Performance Tests on Production | 08/19/2013 | 09/06/2013 | Performance Test Team
12. Stakeholders
Name | Role
Kathleen Kelting | Product Owner
Randy Wallace | ISD Lead
Alok Tewari | Project Manager
Jerold Adair | Business Analyst
Donna Senchesak | Business Analyst
Karen James | Business Analyst
Carolyn Clyde Lofton | Business Analyst
Walter Swiatlowski | Development Lead
Ranjana Choudhary | Test Lead
Venkateswara Rao Dodda | Performance Test Lead
Manav Garg | Integration Development Lead
Webb Douglas | Database Lead
Ronald Foz | Configuration Development Lead
13. System Architecture

13.1 Test Environment

The test environment consists of:
- Web Server: cvgs080055
- App Server: cvgs080055
- Batch Server: cvgs080091
- DB Server: cvgs070004

13.2 System Architecture (Production)
The Production environment is a shared environment consisting of:
- Load Balancer(s): TBD
- Web Server(s): TBD
- App Server(s): cvgs080092, cvgs080093
- Batch Server: cvgs080094
- Database: cvgs080018, cvgs080019