Testing Methodology
Test Levels/Events
Unit Test
Technical Unit Test
Technical Unit Testing (TUT) focuses on the functionality of individually developed custom objects (modified or added).
As the first line of defense in the testing process, unit testing aims to uncover problems early in the development
process. It ensures that objects, when tested individually, are error-free before entering System Test. It also seeks to
determine that the object has achieved the intent of the design.
For each object, a Unit Test Script is created to validate that testing of the object satisfies the requirements specified
in the Functional/Technical Specification. Thus, technical unit testing verifies that development objects, or units
(e.g., customizations, interfaces, data conversion programs, reports) work as per their approved Functional/Technical
Specifications.
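As an illustration only, the sketch below shows how a unit test script for a single development object might be automated; the convert_hire_date routine, its input format, and the expected values are hypothetical examples, not actual project objects.

```python
import unittest

def convert_hire_date(legacy_value: str) -> str:
    """Hypothetical conversion routine: legacy 'MMDDYYYY' string -> ISO 'YYYY-MM-DD'."""
    if len(legacy_value) != 8 or not legacy_value.isdigit():
        raise ValueError(f"Unexpected legacy date value: {legacy_value!r}")
    month, day, year = legacy_value[0:2], legacy_value[2:4], legacy_value[4:8]
    return f"{year}-{month}-{day}"

class TestConvertHireDate(unittest.TestCase):
    """Technical unit test for one object, derived from its Functional/Technical Specification."""

    def test_valid_date_is_reformatted(self):
        # Positive case taken from the (hypothetical) specification example.
        self.assertEqual(convert_hire_date("07311999"), "1999-07-31")

    def test_short_value_is_rejected(self):
        # Negative case: malformed legacy data must not pass silently.
        with self.assertRaises(ValueError):
            convert_hire_date("0731")

if __name__ == "__main__":
    unittest.main()
```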
System Test
The System Test is the act of validating that the primary components of the system (objects, transactional data, and
configuration values) all work together to enable the system to work as designed.
The System Testing phase validates independent strings of development from a functional perspective. The
Project Testing Team expands the test conditions for System Testing and creates detailed system test scripts. The
system tester(s) revise and develop both positive and negative test scenarios. These test scenarios depict the
appropriate business activities and processes to be tested. This is, in essence, a more limited precursor of the
Integration Test events.
Testing will be performed by both Capgemini and CLIENT. This is accomplished by merging the objects that have
gone through Unit Testing, the configuration, and converted data, in order to do as much pre-Integration testing as
early as possible so that the Integration Testing events are more successful. The aim is to test as many objects as are
made available by development during testing. Each object has to be tested at least twice: once during pre-Integration Testing and once during Integration Testing.
The system test is typically done in a full copy of the production environment. This ensures that the appropriate
configuration values are being tested, along with legacy data that is converted into the System Test environment (if
data is to be converted for production). The system test scripts are executed, using representative client data, to
validate that all three primary components (the configuration, transactional data, and objects) operate together
correctly to produce the desired result for each process.
Application security is also tested at this level, with real-life replicas of profile data.
The System test is performed by the Project Testing Team, in a QA-level environment.
Integration Test
The Integration Test is the final validation, prior to User Acceptance, that the system is operating as per the defined
requirements and any other items defined within the scope of the business requirements. The purpose of CRP is to
simulate end-to-end business processes. The team will enter transactions into the application, transfer data between
systems running completed interfaces, generate reports (standard and custom), and validate the conversion data. We
will also simulate payroll processing activities.
For example, testing will be done to validate that a new hire is paid correctly, thus verifying the integration of the
HR and Payroll applications. Another example of integration testing would be to test an interface that directly
supports the application (inbound) or an interface that is produced from a process within the application (outbound).
For each process (i.e. integration of functions), a corresponding Test Scenario is created. These scenarios define the
list of functional test scripts that combine to form the integration test for the process.
The scenarios for Integration Testing are typically organized by high-level process (e.g., Personal Data,
Performance Management and Payroll). During the Integration Test effort the following types of information will be
confirmed:
Converted data
Interfaces inbound to and outbound from PeopleSoft
Interfaced data inspection
Backup and recovery
Reconciliation and acceptance
Custom reports
Each developed object (gap) will be tested by the project teams as per the test scripts/scenarios written for each
gap. All test scripts have three (3) common approaches:
The steps and process for running the object being tested
Reconciliation and validation of employee data.
Reconciliation of a sample of individual records; these tested records will be documented in a
spreadsheet/table, and the representative from the business involved in that test will sign off
on it (see the sketch after this list).
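A minimal sketch of the record-level reconciliation described above, assuming hypothetical field names and a CSV output file; the real comparison would run against the actual legacy and converted data sources.

```python
import csv

# Hypothetical sample: key fields for a handful of employees, pulled from the
# legacy extract and from the converted (new system) extract.
legacy_sample = {
    "10001": {"last_name": "Smith", "annual_salary": "52000.00"},
    "10002": {"last_name": "Jones", "annual_salary": "61000.00"},
}
converted_sample = {
    "10001": {"last_name": "Smith", "annual_salary": "52000.00"},
    "10002": {"last_name": "Jones", "annual_salary": "60100.00"},  # mismatch to be flagged
}

# Write a reconciliation table that the business representative can review and sign off.
with open("reconciliation_sample.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["employee_id", "field", "legacy_value", "converted_value", "match"])
    for emp_id, legacy_rec in legacy_sample.items():
        converted_rec = converted_sample.get(emp_id, {})
        for field_name, legacy_value in legacy_rec.items():
            converted_value = converted_rec.get(field_name, "<missing>")
            writer.writerow([emp_id, field_name, legacy_value, converted_value,
                             "Y" if legacy_value == converted_value else "N"])
```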
If an error occurs during a testing scenario, the program will be fixed and testing will resume, but not
necessarily through the entire scenario. Depending on what the test is doing, testing will restart at an appropriate
preceding step.
As in the System Test above, the tests will be conducted per the test scenarios and the results documented therein and in
the Test Scenario Status Report (details in the section below).
Integration Testing is performed by the Project Testing Team, with direct involvement from the CLIENT business
teams (in building the test scenarios, and in execution itself). The test is performed in the fully-connected QA
environment.
As mentioned above, a Payroll reconciliation test is performed in parallel to Integration Testing: the payroll
info for a selected time span is entered through the new system and reconciled against the existing data. This
accomplishes the objectives of a parallel test without the extensive effort required for duplicate entry into two
systems.
Also, as part of Integration Testing, all security user profiles are exercised, by running the processes in scope for
each profile.
Regression Test
(not a separate event, but a method present at System and Integration test level)
A Regression Test is the act of validating that both delivered and customized objects and tables were correctly
included in the new upgrade pass. It involves viewing menu objects and verifying the ability to save tables without
receiving errors.
Regression Testing also validates that any of the newly migrated objects or tables have no effect on the data already
contained in the Production environment. The Regression testing team follows the same testing scenario and scripts
as that of the prior testing effort. The Regression Test additionally serves as a validation check that the Production
Support team is comfortable with accepting the newly implemented phases into the Production environment.
The Project Testing team and the Application Maintenance Production Support team are jointly responsible for
Regression Testing.
Performance Test
Performance test is the act of determining or validating values for given performance metrics. Performance metrics
identify activities or system functions, the values that describe the expected performance of the function, and the
method that will be used to measure the value.
The Project team will be conducting network connectivity, response, and benchmarking testing, as well as
application performance tests comprising performance, load, and stress tests. The Project Testing team will
work with the CLIENT business teams to determine the criteria for these events, from which the SLAs and SLOs
will be derived.
The current assumption is that the benchmarking exercise, as well as the entire actual performance test event, will be
carried out using the HP Performance Center (a.k.a. LoadRunner) tool.
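For illustration, the sketch below shows one way a performance metric could be evaluated after a test run; the 95th-percentile target of 3 seconds, the transaction name, and the sample timings are placeholder assumptions, not agreed SLAs, and the actual measurements would come from the HP Performance Center results.

```python
import math

def percentile(values, pct):
    """Nearest-rank percentile of a list of response times (seconds)."""
    ordered = sorted(values)
    rank = max(1, math.ceil(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]

# Placeholder response times (seconds) for one transaction, e.g. 'Save Personal Data'.
response_times = [1.2, 0.9, 2.4, 1.8, 2.9, 1.1, 2.0, 1.5, 2.7, 1.3]

target_seconds = 3.0          # hypothetical SLA: 95% of transactions complete within 3 seconds
p95 = percentile(response_times, 95)
print(f"95th percentile = {p95:.2f}s -> {'PASS' if p95 <= target_seconds else 'FAIL'}")
```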
Testing Strategy
The Testing Strategy:
Describes the test scope (e.g. list of Use Cases covered, etc.)
Describes the testing levels, types, and methods
Describes overall testing processes and procedures
Describes overall testing documentation (metrics included)
Defines test environments
Defines testing tools
Defines overall testing roles and responsibilities
Defines overall testing schedule
Test Planning
Test Plans are created by interpreting the Testing Strategy for each specific level after Unit Test, refining the
following:
Scope of testing
Approach to be adopted (including methods, techniques, etc.)
Roles and resources involved
Environments and data to be used
Tools to be used
The way in which features will be demonstrated
How results must be documented and communicated
Event entry/exit criteria, and suspension/resumption criteria
Detailed testing schedule
The Business Requirements lists and associated high-level test scenarios are part of the Test Plans. In other words,
the Test Plans contain the high-level requirements traceability matrices associated with a Test Level.
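As a simple illustration of how such a traceability matrix supports the later "requirements coverage criteria met" exit check, the sketch below uses hypothetical requirement and test script identifiers.

```python
# Hypothetical high-level requirements traceability matrix for one test level:
# each business requirement maps to the test scripts planned to demonstrate it.
traceability = {
    "BR-001 New hire processing": ["ST-010", "ST-011"],
    "BR-002 Salary change":       ["ST-020"],
    "BR-003 Termination payout":  [],          # not yet covered by any script
}

covered = [req for req, scripts in traceability.items() if scripts]
coverage_pct = 100.0 * len(covered) / len(traceability)

print(f"Requirements covered: {len(covered)}/{len(traceability)} ({coverage_pct:.0f}%)")
for req, scripts in traceability.items():
    if not scripts:
        print(f"  GAP: {req} has no planned test script")
```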
For unit test, we will not develop standalone test plans, just test scripts (see below).
Test Design
The scope covered by the Test Plans is specified in Test Scripts, which contain detailed instructions on how to test a
certain test objective. Each test script will have a general description line which covers the objective under test.
The test script steps are supposed to contain all the data, system and environment specifications that are necessary in
order to run the test case. The inputs and the expected outputs (application, system, data output) are specified
for each test step.
Depending on the test level, the test scripts may be extremely detailed or less detailed (e.g. in a web portal
application ST test case where a complex transaction is tested, one step may be 'fill out all fields on page A').
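A minimal sketch of how a test script step could be represented so that the input data and expected output are captured for each step; the step text and field names are hypothetical examples only.

```python
from dataclasses import dataclass, field

@dataclass
class TestStep:
    """One step of a test script: action, input data, expected output, and the recorded result."""
    number: int
    action: str
    input_data: dict = field(default_factory=dict)
    expected_output: str = ""
    actual_output: str = ""
    status: str = "Not attempted"   # Passed / Failed / On hold / Not attempted

# Example steps for a (hypothetical) 'Hire Employee' system test script.
script = [
    TestStep(1, "Fill out all fields on the Personal Data page",
             input_data={"last_name": "Smith", "hire_date": "1999-07-31"},
             expected_output="Record saved without errors"),
    TestStep(2, "Run the new-hire interface to the benefits system",
             expected_output="Interface file contains the new employee"),
]

for step in script:
    print(f"Step {step.number}: {step.action} -> expected: {step.expected_output}")
```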
Test Execution
The designed test scripts are now executed.
The results are analyzed and corrective actions are taken, by diagnosing and correcting problems (via the defect
tracking tool), re-testing according to standards, and completing the relevant documentation. The defect fixing/retest
cycle is iterative and repeats until the exit gates of the respective test level event are met. During the
event, test status reports are published, and at the end of the event, a test summary report is published.
We currently assume that the following test execution service level agreements (SLAs) are also observed:
Post Unit Test, Test Status Reports are produced daily, or at the conclusion of each build, and Test Summary Reports
are produced at the conclusion of each level of testing (e.g. once the application has passed the Integration Test exit
gates, an Integration Test Summary Report is published).
These reports typically contain:
Test case metrics (i.e. percentages of test cases passed, failed, on-hold pending defect fix, on-hold pending
functionality, on-hold pending data, not attempted, out of scope)
Defect logs and metrics. They contain statistics of the associated defects. Here are the proposed areas to measure:
Total number of defects found
Open defects by test event
Closed defects by test event
Detailed defect status by test event
Severity/Priority of defects found
Aging of open defects
Root cause of defects
Resolution type of closed defects
A Status Summary cover sheet.
In practice, the daily status report reflects the suite of test cases executed on that particular day. The
above metrics may be combined into a single QA Scorecard matrix to allow the Program executives to better track
the overall status of the projects relative to product quality.
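For illustration, the sketch below computes the test case percentages and a simple defect-aging view from hypothetical status data, similar to what would feed a daily status report or QA Scorecard; the statuses, defect IDs, and dates are placeholder values.

```python
from collections import Counter
from datetime import date

# Hypothetical execution statuses for the day's test cases.
test_case_statuses = ["Passed", "Passed", "Failed", "On-hold pending defect fix",
                      "Not attempted", "Passed", "Out of scope"]

counts = Counter(test_case_statuses)
total = len(test_case_statuses)
for status, count in counts.items():
    print(f"{status}: {count} ({100.0 * count / total:.0f}%)")

# Hypothetical open defects: (defect id, severity, date raised) -> aging in days.
open_defects = [("DEF-101", "S1", date(2024, 5, 2)),
                ("DEF-114", "S3", date(2024, 5, 10))]
today = date(2024, 5, 15)
for defect_id, severity, raised in open_defects:
    print(f"{defect_id} ({severity}) open for {(today - raised).days} days")
```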
At the conclusion of the entire Testing Process, before the application goes live, the Master Test Summary Report
(summarizing the Test Process findings and making proposals for future releases) is published.
The activities, deliverables, entry criteria, and exit criteria for each test level are summarized below.

Unit Test
Deliverables: Unit Test Scripts; Unit Test Execution and Defect Weekly Status Reports
Entry Criteria: Development complete for that object; Baselined Technical Specifications
Exit Criteria: Requirements coverage criteria met; All planned test scripts are executed and passed; Discovered defects are resolved, consistent with the coverage criteria

System Test
Activities: System Test Planning; Design Automation Framework (TBD); Automate Scripts (TBD); Event Sign-Off
Deliverables: System Test Plan
Entry Criteria: Test stage is complete; Baselined System Scenarios, Scripts, Test Data, Traceability Matrix; Data Loading complete
Exit Criteria: Requirements coverage criteria met; All planned test scripts are executed, and at least 90% are passed; All Critical and Severe (S1 and S2) discovered defects are resolved, consistent with the coverage criteria

Integration Test
Activities: Integration Planning; Integration Test Scenarios & Script Preparation; Scripts Execution; Design Automation Framework (TBD); Automate Scripts (TBD); Event Sign-Off
Deliverables: Integration Scenarios and Scripts; Integration Test Execution and Defect Daily Status Reports; Integration Test Summary Report
Entry Criteria: Baselined Integration Test Plan; Baselined Technical Specifications; Environment is ready; Baselined Integration Scenarios, Scripts, Test Data, Traceability Matrix; Data Loading complete
Exit Criteria: Requirements coverage criteria met; All planned test scripts are executed and passed; Discovered defects are resolved, consistent with the coverage criteria

Performance Test
Activities: Performance Test Planning; Scenarios & Script Preparation; Scripts Execution
Entry Criteria: Baselined Performance Plan; Baselined NFRs and Technical Specifications; Environment is ready; Baselined Scenarios, Scripts, Test Data, Traceability Matrix; Application stable (e.g. start during SIT cycle 1, when all S1 and S2 defects are fixed)
Exit Criteria: Requirements coverage criteria met; All planned test scripts are executed and passed; Discovered defects are resolved, consistent with the coverage criteria

UAT
Activities: UAT Planning
Deliverables: UAT Plan
Entry Criteria: Environment is ready
Exit Criteria: Requirements coverage criteria met; All planned scripts are executed and passed