
SW Testing Overview

1
Outline
SW Testing
– Basics of testing
– Levels of software testing
– Testing techniques

V and V

QA, QC, Testing

Testing Method

2
Testing Definition

Testing can be defined as a process of analyzing a software item to detect the differences between existing and required conditions (that is, defects/errors/bugs), and to evaluate the features of the software item (ANSI/IEEE 1059 standard).
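As a minimal illustration of this definition, the sketch below compares the existing behaviour of an invented discount function against an assumed required condition; the function, the threshold, and the 10% rule are all hypothetical.

```python
# Hypothetical requirement: orders of 100 or more receive a 10% discount.
# The test detects any difference between this required condition and
# the existing behaviour of the software item.

def apply_discount(total):
    """Software item under test (illustrative only)."""
    return total * 0.9 if total >= 100 else total

def test_discount_matches_requirement():
    assert round(apply_discount(100), 2) == 90.0   # discount applies at the threshold
    assert apply_discount(50) == 50                # no discount below the threshold

if __name__ == "__main__":
    test_discount_matches_requirement()
    print("existing behaviour matches the required conditions")
```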

3
Goals of Testing

Identify errors
– Make errors repeatable (when do they occur?)
– Localize errors (where are they?)

The purpose of testing is to find problems in programs so they can be fixed.
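A common way to make an error repeatable is to pin down every source of nondeterminism before re-running the failing case; in the hypothetical sketch below the random seed is fixed so the same behaviour can be reproduced and localized.

```python
import random

def pick_winner(entries):
    """Function under test (illustrative)."""
    return random.choice(entries)

def test_pick_winner_is_repeatable():
    # Fixing the seed makes any misbehaviour reproducible run after run,
    # which is the first step toward localizing the underlying error.
    random.seed(42)
    first = pick_winner(["a", "b", "c"])
    random.seed(42)
    second = pick_winner(["a", "b", "c"])
    assert first == second

if __name__ == "__main__":
    test_pick_winner_is_repeatable()
    print("the behaviour is reproducible under a fixed seed")
```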

4
A Good Test:

Has a reasonable probability of catching an error
Is not redundant
Is neither too simple nor too complex
Reveals a problem
Is a failure if it doesn’t reveal a problem

5
Who Does Testing

Software Tester
Software Developer
Project Lead/Manager
End User
Other companies have different teams, including roles such as
Software Quality Assurance Engineer, QA Analyst, etc.

6
Levels of Software Testing

Unit/Component testing
Integration testing
System testing
Acceptance testing
Installation testing

7
When to Start Testing

Testing is done in different forms at every phase of the SDLC:

1. During the requirement gathering phase, the analysis and verification of requirements are also considered testing.
2. Reviewing the design in the design phase with the intent to improve the design is also considered testing.
3. Testing performed by a developer on completion of the code is also categorized as testing.

8
When to Stop Testing

1. Completion of test case execution
2. Completion of functional and code coverage to a certain point
3. Bug rate falls below a certain level and no high-priority bugs are identified
4. Management decision
5. Testing deadlines
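These criteria are often checked mechanically at the end of a test cycle. The sketch below shows one possible way to combine them into a stop/continue decision; the 80% coverage figure and the bug-rate threshold are illustrative assumptions, not prescribed values.

```python
def should_stop_testing(executed, planned, coverage_pct,
                        open_high_priority_bugs, new_bugs_this_week):
    """Illustrative exit-criteria check; all thresholds are assumptions."""
    all_cases_run = executed >= planned                      # criterion 1
    coverage_ok = coverage_pct >= 80.0                       # criterion 2 (assumed 80%)
    bug_rate_ok = (new_bugs_this_week <= 2
                   and open_high_priority_bugs == 0)         # criterion 3
    return all_cases_run and coverage_ok and bug_rate_ok

# Example: every planned case run, 85% coverage, no open high-priority
# bugs, one new bug found this week -> testing may stop.
print(should_stop_testing(120, 120, 85.0, 0, 1))  # True
```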

9
Levels of Software Testing

Unit/Component testing
– Verify implementation of each software element
– Trace each test to detailed design (a minimal unit-test sketch follows after this list)

Integration testing
System testing
Acceptance testing
Installation testing
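To make the unit level concrete, here is a minimal unit test written with Python's standard unittest module; the word_count function being tested is invented for the example.

```python
import unittest

def word_count(text):
    """Software element under test (hypothetical)."""
    return len(text.split())

class WordCountTest(unittest.TestCase):
    # Each test method would be traced back to an item in the detailed design.
    def test_counts_words(self):
        self.assertEqual(word_count("unit testing in practice"), 4)

    def test_empty_string_has_no_words(self):
        self.assertEqual(word_count(""), 0)

if __name__ == "__main__":
    unittest.main()
```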

10
Levels of Software Testing

Unit/Component testing
Integration testing
– Combine software units and test until the entire
system has been integrated
– Trace each test to high-level design

System testing
Acceptance testing
Installation testing

11
Levels of Software Testing

Unit/Component testing
Integration testing
System testing
– Test integration of hardware and software
– Ensure software as a complete entity complies with
operational requirements
– Trace test to system requirements

Acceptance testing
Installation testing

12
Levels of Software Testing

Unit/Component testing
Integration testing
System testing
Acceptance testing
– Determine if test results satisfy acceptance criteria of
project stakeholder
– Trace each test to stakeholder requirements

Installation testing

13
Levels of Software Testing

Unit/Component testing
Integration testing
System testing
Acceptance testing
Installation testing
– Perform testing with application installed on its target
platform

14
Testing Phases: V-Model

[V-model diagram: each specification/design activity on the left produces a test plan that drives the corresponding testing activity on the right]
– Requirements Specification → Acceptance Test Plan → Acceptance Test
– System Specification → System Integration Test Plan → System Integration Test
– System Design → Sub-system Integration Test Plan → Sub-system Integration Test
– Detailed Design → Unit Code and Test
– Service follows acceptance testing

15
Hierarchy of Testing

[Diagram: hierarchy of testing approaches]
Ad Hoc Testing
Program Testing
– Unit Testing
– Integration Testing: Top Down, Bottom Up, Big Bang, Sandwich
– Black Box: Equivalence, Boundary, Decision Table, State Transition, Use Case, Domain Analysis
– White Box: Control Flow, Data Flow
System Testing
– Function
– Properties: Performance, Reliability, Availability, Security, Usability, Documentation, Portability, Capacity
Acceptance Testing
– Benchmark, Pilot, Alpha, Beta
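As one example of the black-box techniques in this hierarchy, the sketch below applies equivalence partitioning and boundary-value analysis to an invented age-validation rule; the 18-65 range is an assumption made for the example.

```python
def is_eligible(age):
    """Rule under test (illustrative): ages 18-65 inclusive are accepted."""
    return 18 <= age <= 65

# Equivalence classes: below range, in range, above range.
# Boundary values: 17/18 at the lower edge, 65/66 at the upper edge.
test_cases = [
    (10, False),  # below-range partition
    (17, False),  # lower boundary - 1
    (18, True),   # lower boundary
    (40, True),   # in-range partition
    (65, True),   # upper boundary
    (66, False),  # upper boundary + 1
    (90, False),  # above-range partition
]

for age, expected in test_cases:
    assert is_eligible(age) == expected, f"failed for age {age}"
print("all equivalence/boundary cases passed")
```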


Levels of SW Testing

17
VERIFICATION and VALIDATION (V and V)

18
What is V&V?

Verification
– Evaluation of an object to demonstrate that it meets its
specification. (Did we build the system right?)
– Evaluation of the work product of a development phase to
determine whether the product satisfies the conditions
imposed at the start of the phase.

Validation
– Evaluation of an object to demonstrate that it meets the
customer’s expectations. (Did we build the right system?)

19
V&V and Software Lifecycle
Throughout software lifecycle, e.g., V-model

20
Requirement Engineering

Determine general test strategy/plan (techniques, criteria, team)
Test requirements specification
– Completeness
– Consistency
– Feasibility (functional, performance requirements)
– Testability (specific; unambiguous; quantitative; traceable)

Generate acceptance/validation testing data
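A requirement only becomes testable once it is specific and quantitative. The sketch below turns an assumed requirement ("a report is generated in under 2 seconds") into an executable acceptance check; the 2-second budget and the report generator are invented for illustration.

```python
import time

def generate_report():
    """Hypothetical stand-in for the real report generator."""
    time.sleep(0.1)          # simulated work
    return "report contents"

def test_report_is_generated_within_two_seconds():
    # Acceptance check traced to the (assumed) quantitative requirement.
    start = time.perf_counter()
    report = generate_report()
    elapsed = time.perf_counter() - start
    assert report, "a report must be produced"
    assert elapsed < 2.0, f"took {elapsed:.2f}s, requirement is < 2 s"

if __name__ == "__main__":
    test_report_is_generated_within_two_seconds()
    print("acceptance criterion met")
```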

21
Design

Determine system and integration test strategy
Assess/test the design
– Completeness
– Consistency
– Handling scenarios
– Traceability (to and from)
– Design walkthrough, inspection

22
Implementation and Maintenance

Implementation
– Determine unit test strategy
– Techniques (static vs. dynamic)
– Tools, and whistles and bells (driver/harness, stub; see the sketch after this list)

Maintenance
– Determine regression test strategy
– Documentation maintenance (vital!)
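The sketch below illustrates one of those "whistles and bells": a stub standing in for a not-yet-integrated payment service so a unit can be tested in isolation, with the test itself acting as the driver. All class and function names are invented for the example.

```python
class PaymentServiceStub:
    """Stub replacing a real, not-yet-available payment service."""
    def charge(self, amount):
        # Canned answer instead of a real network call.
        return {"status": "approved", "amount": amount}

def checkout(cart_total, payment_service):
    """Unit under test (illustrative); depends on an external service."""
    result = payment_service.charge(cart_total)
    return result["status"] == "approved"

def test_checkout_with_stubbed_payment_service():
    # The test acts as the driver and exercises the unit through the stub.
    assert checkout(25.0, PaymentServiceStub()) is True

if __name__ == "__main__":
    test_checkout_with_stubbed_payment_service()
    print("unit tested in isolation via a stub")
```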

23
Hierarchy of V&V Techniques

[Diagram: taxonomy of V&V techniques; dynamic and static techniques are complementary]
Dynamic techniques
– Testing (in the narrow sense)
– Symbolic execution
– Model checking
Static techniques
– Formal analysis: static analysis, proof
– Informal analysis: walkthrough, inspection, reading
24
Definitions of V&V Terms

“Correct” program and specification
– Program matches its specification
– Specification matches the client’s intent

Error (a.k.a. mistake)
– A human activity that leads to the creation of a fault
– A human error results in a fault which may, at runtime, result in a failure

Fault (a.k.a. bug)
– The physical manifestation of an error that may result in a failure
– A discrepancy between what something should contain (in order for failure to be impossible) and what it does contain

Failure (a.k.a. symptom, problem, incident)
– Observable misbehavior
– Actual output does not match the expected output
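The chain from error to fault to failure can be seen in a few lines of code; the off-by-one mistake below is invented purely to illustrate the three terms.

```python
def average(values):
    """A human error led to a fault (bug): dividing by len(values) + 1."""
    return sum(values) / (len(values) + 1)

def test_average():
    # Failure: the observed output (2.25) does not match the expected output (3.0).
    assert average([2, 3, 4]) == 3.0

if __name__ == "__main__":
    try:
        test_average()
    except AssertionError:
        print("failure observed: actual output differs from expected output")
```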

25
Definitions

Fault identification and correction
– Fault identification: the process of determining what fault caused a failure
– Fault correction: the process of changing a system to remove the fault

Debugging
– The act of finding and fixing program errors

Testing
– The act of designing, debugging, and executing tests

26
Definitions

Test case, test set, and test suite
– Test case: a particular set of inputs and the expected output
– Test set/suite: a finite set of test cases working together toward the same purpose

Test oracle
– Any means used to predict the expected outcome of a test
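These definitions map naturally onto simple data structures; in the hypothetical sketch below each test case pairs an input with its expected output, the cases form a small test suite, and the recorded expected values serve as the oracle.

```python
# Each test case: a particular input together with its expected output.
test_suite = [
    {"input": "Hello World", "expected": "hello world"},
    {"input": "ABC",         "expected": "abc"},
]

def normalize(text):
    """Program under test (illustrative)."""
    return text.lower()

def oracle(text):
    """Test oracle: the means used here to predict the expected outcome."""
    for case in test_suite:
        if case["input"] == text:
            return case["expected"]
    raise KeyError(text)

for case in test_suite:
    assert normalize(case["input"]) == oracle(case["input"])
print("all test cases in the suite passed against the oracle")
```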

27
Sample Causes of Error

[Diagram: sample causes of error introduced at each development phase]
– Requirements: incorrect or missing requirements
– System Design: incorrect translation, incorrect design specification
– Program Design: incorrect design interpretation
– Program Implementation: incorrect documentation, incorrect semantics
– Unit Testing: incomplete testing
– System Testing: new faults introduced while correcting others

28
V&V Techniques

[Diagram: V&V techniques applied across the lifecycle phases Requirements, Design, Implementation, Testing, Operation, and Maintenance]
– Reviews: walkthroughs/inspections
– Synthesis
– Runtime monitoring
– Model checking
– Correctness proofs

29
NON-FUNCTIONAL TESTING TYPES

Objectives and Methods

Various Types of Non-Functional Testing

• Performance Testing
• Load Testing
• Stress Testing
• Compatibility Testing
• Security Testing
• Usability Testing
• Localization Testing
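As a small illustration of the performance/load category, the sketch below fires a burst of calls at an invented operation and checks an assumed per-request response-time budget; real load testing would use dedicated tooling, so treat this only as a sketch.

```python
import time

def handle_request():
    """Operation under load (hypothetical)."""
    time.sleep(0.001)   # simulated processing time

def run_load_test(requests=200, budget_seconds=0.05):
    """Fire a burst of requests and check each stays within the assumed budget."""
    worst = 0.0
    for _ in range(requests):
        start = time.perf_counter()
        handle_request()
        worst = max(worst, time.perf_counter() - start)
    assert worst <= budget_seconds, f"slowest request took {worst:.4f} s"
    return worst

if __name__ == "__main__":
    print(f"slowest request: {run_load_test():.4f} s")
```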
