Software Testing
DEFINITION
Software quality is the degree of conformance to explicit or implicit
requirements and expectations.
Explanation:
Definition by ISTQB
As with any definition, the definition of software quality is also varied and
debatable. Some even say that quality cannot be defined and some say
that it can be defined but only in a particular context. Some even state
confidently that quality is lack of bugs. Whatever the definition, it is true
that quality is something we all aspire to.
Software quality has many dimensions.
In order to ensure software quality, we undertake Software Quality
Assurance and Software Quality Control.
Popular models/standards for quality management include:
CMMI
Six Sigma
ISO 9000
Note: There are many other models/standards for quality management but the ones
mentioned above are the most popular.
Software Quality Assurance encompasses the entire software development life cycle and
the goal is to ensure that the development and/or maintenance processes are continuously
improved to produce products that meet specifications/requirements.
The process of Software Quality Control (SQC) is also governed by Software Quality
Assurance (SQA).
SQA is generally shortened to just QA.
SOFTWARE QUALITY CONTROL Fundamentals
Software Quality Control (SQC) is a set of activities for ensuring quality in software
products.
It includes the following activities:
Reviews
o Requirement Review
o Design Review
o Code Review
o Deployment Plan Review
o Test Plan Review
o Test Cases Review
Testing
o Unit Testing
o Integration Testing
o System Testing
o Acceptance Testing
Software Quality Control is limited to the Review/Testing phases of the Software
Development Life Cycle and the goal is to ensure that the products meet
specifications/requirements.
The process of Software Quality Control (SQC) is governed by Software Quality
Assurance (SQA). While SQA is oriented towards prevention, SQC is oriented towards
detection. Read Differences between Software Quality Assurance and Software Quality
Control.
Some people assume that QC means just Testing and fail to consider Reviews; this should
be discouraged.
Differences between Software Quality Assurance (SQA) and Software Quality Control
(SQC):
Many people still use the term Quality Assurance (QA) and Quality Control (QC)
interchangeably but this should be discouraged.
Definition:
SQA is a set of activities for ensuring quality in software engineering processes (that ultimately result in quality in software products). The activities establish and evaluate the processes that produce products.
SQC is a set of activities for ensuring quality in software products. The activities focus on identifying defects in the actual products produced.
Focus: SQA is process focused; SQC is product focused.
Orientation: SQA is prevention oriented; SQC is detection oriented.
Breadth: SQA is organization wide; SQC is product/project specific.
Scope: SQA relates to all products that will ever be created by a process; SQC relates to a specific product.
Activities: SQA activities include Audits and Training; SQC activities include Reviews and Testing.
User Documentation
System Testing
Acceptance Testing
Production Build/Deployment
Release
Maintenance
SDLC IN DETAIL
Project Planning
o Prepare
o Review
o Rework
o Baseline
o Revise [if necessary] >> Review >> Rework >> Baseline
Requirements Development [Business Requirements and Software/Product
Requirements]
o Develop
o Review
o Rework
o Baseline
o Revise [if necessary] >> Review >> Rework >> Baseline
Estimation [Size / Effort / Cost]
o <same as the activities/tasks mentioned for Project Planning>
Scheduling
o <same as the activities/tasks mentioned for Project Planning>
Prepare
Review
Rework
Baseline
Execute
Revise [if necessary] >> Review >> Rework >> Baseline >> Execute
Integration Testing
o <same as the activities/tasks mentioned for unit testing>
User Documentation
o Prepare
o Review
o Rework
o Baseline
o Revise [if necessary] >> Review >> Rework >> Baseline
System Testing
o <same as the activities/tasks mentioned for Unit Testing>
Acceptance Testing [Internal Acceptance Test and External Acceptance Test]
o <same as the activities/tasks mentioned for Unit Testing>
Production Build/Deployment
o <same as the activities/tasks mentioned for Test Build/Deployment>
Release
o Prepare
o Review
o Rework
o Release
Maintenance
o Recode [Enhance software / Fix bugs]
o Retest
o Redeploy
o Rerelease
Notes:
The life cycle mentioned here is NOT set in stone and each phase does not
necessarily have to be implemented in the order mentioned.
Though SDLC uses the term Development, it does not focus just on the coding
tasks done by developers but incorporates the tasks of all stakeholders, including
testers.
There may still be many other activities/tasks that have not been specifically mentioned
above, such as Configuration Management. No matter what, it is essential that you clearly
understand the software development life cycle your project is following. One issue that is
widespread in many projects is that software testers are involved much later in the life cycle,
due to which they lack visibility and authority (which ultimately compromises software
quality).
DEFINITION OF TEST
Being in the software industry, we encounter the word TEST many times. Though
we have our own specific meaning of the word TEST, we have collected here some
definitions of the word as provided by various dictionaries and other tidbits. The word TEST
can be a noun, a verb, or an adjective, but the definitions here are only of the noun form.
DEFINITION
Google Dictionary:
A Test is a deliberate action or experiment to find out how well something works.
Cambridge Advanced Learner's Dictionary:
A Test is an act of using something to find out whether it is working correctly or how effective
it is.
If the word TEST has been nauseating you because it is overused, try the following
synonyms:
Analysis
Assessment
Attempt
Check
Confirmation
Evaluation
Examination
Experiment
Inquiry
Inspection
Investigation
Scrutiny
Trial
Verification
METHODOLOGIES of Software Testing
Below are some methods / techniques of software testing:
Agile Testing is a method of software testing that follows the principles of agile
software development.
These methods can be used at various software testing levels and types.
BLACK BOX TESTING Fundamentals
Black Box Testing is named so because the software program, in the eyes of the tester, is
like a black box, inside which one cannot see. This method attempts to find errors in the
following categories:
Incorrect or missing functions
Interface errors
Errors in data structures or external database access
Behavior or performance errors
Initialization and termination errors
Definition by ISTQB
black box testing: Testing, either functional or non-functional, without reference to the
internal structure of the component or system.
black box test design technique: Procedure to derive and/or select test cases based on an
analysis of the specification, either functional or non-functional, of a component or system
without reference to its internal structure.
EXAMPLE
A tester, without knowledge of the internal structures of a website, tests the web pages by
using a browser; providing inputs (clicks, keystrokes) and verifying the outputs against the
expected outcome.
LEVELS APPLICABLE TO
Black Box Testing method is applicable to the following levels of software testing:
Integration Testing
System Testing
Acceptance Testing
The higher the level, and hence the bigger and more complex the box, the more the Black
Box Testing method comes into use.
BLACK BOX TESTING TECHNIQUES
Following are some techniques that can be used for designing black box tests.
Equivalence partitioning: It is a software test design technique that involves dividing
input values into valid and invalid partitions and selecting representative values from
each partition as test data.
Boundary Value Analysis: It is a software test design technique that involves
determination of boundaries for input values and selecting values that are at the
boundaries and just inside/ outside of the boundaries as test data.
Cause-Effect Graphing: It is a software test design technique that involves identifying
the causes (input conditions) and effects (output conditions), producing a Cause-Effect Graph, and generating test cases accordingly.
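For illustration, the first two techniques above can be sketched in code. This is a minimal sketch assuming a hypothetical input field that accepts integers from 1 to 100; the range and the function names are assumptions, not part of the techniques themselves.

```python
# Hypothetical valid range for the input field under test.
MIN_VALID, MAX_VALID = 1, 100

def equivalence_partitions():
    # Equivalence Partitioning: one representative value per partition,
    # i.e. below the range (invalid), inside it (valid), above it (invalid).
    return {
        "invalid_low": MIN_VALID - 50,
        "valid": (MIN_VALID + MAX_VALID) // 2,
        "invalid_high": MAX_VALID + 50,
    }

def boundary_values():
    # Boundary Value Analysis: values at, just inside, and just outside
    # each boundary of the valid range.
    return [MIN_VALID - 1, MIN_VALID, MIN_VALID + 1,
            MAX_VALID - 1, MAX_VALID, MAX_VALID + 1]
```

Each returned value would then become the test data for one test case.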
BLACK BOX TESTING ADVANTAGES
Tests are done from a user's point of view and will help in exposing discrepancies in
the specifications.
Tester need not know programming languages or how the software has been
implemented.
Tests can be conducted by a body independent from the developers, allowing for an
objective perspective and the avoidance of developer-bias.
WHITE BOX TESTING Fundamentals
White Box Testing is a software testing method in which the internal structure of the item
being tested is known to the tester.
EXAMPLE
A tester, usually a developer as well, studies the implementation code of a certain field on a
webpage, determines all legal (valid and invalid) and illegal inputs, and verifies the outputs
against the expected outcomes, which are also determined by studying the implementation
code.
White Box Testing is like the work of a mechanic who examines the engine to see why the
car is not moving.
LEVELS APPLICABLE TO
White Box Testing method is applicable to the following levels of software testing:
Unit Testing: For testing paths within a unit.
Integration Testing: For testing paths between units.
System Testing: For testing paths between subsystems.
However, it is mainly applied to Unit Testing.
WHITE BOX TESTING ADVANTAGES
Testing can be commenced at an earlier stage. One need not wait for the GUI to be
available.
Testing is more thorough, with the possibility of covering most paths.
WHITE BOX TESTING DISADVANTAGES
Since tests can be very complex, highly skilled resources are required, with thorough
knowledge of programming and implementation.
Test script maintenance can be a burden if the implementation changes too
frequently.
Since this method of testing is closely tied to the application being tested, tools that
cater to every kind of implementation/platform may not be readily available.
White Box Testing is contrasted with Black Box Testing. Read the Differences between
Black Box Testing and White Box Testing.
GRAY BOX TESTING Fundamentals
DEFINITION
Gray Box Testing is a software testing method which is a combination of the Black Box
Testing method and the White Box Testing method. In Black Box Testing, the internal structure of
the item being tested is unknown to the tester, and in White Box Testing the internal
structure is known. In Gray Box Testing, the internal structure is partially known. This
involves having access to internal data structures and algorithms for purposes of designing
the test cases, but testing at the user, or black-box, level.
Gray Box Testing is named so because the software program, in the eyes of the tester, is
like a gray/semi-transparent box, inside which one can partially see.
EXAMPLE
An example of Gray Box Testing would be when the codes for two units/ modules are
studied (White Box Testing method) for designing test cases and actual tests are conducted
using the exposed interfaces (Black Box Testing method).
LEVELS APPLICABLE TO
Though the Gray Box Testing method may be used in other levels of testing, it is primarily useful
in Integration Testing.
SPELLING
Note that Gray is also spelt as Grey. Hence Grey Box Testing and Gray Box Testing mean
the same.
AGILE TESTING Fundamentals
This article on Agile Testing assumes that you already understand Agile software
development methodology (Scrum, Extreme Programming, or other flavors of Agile). Also, it
discusses the idea at a high level and does not give you the specifics.
VERY SHORT DEFINITION
Agile Testing is a method of software testing that follows the principles of agile software
development.
Individuals and interactions over processes and tools: This does not mean that
agile testing ignores processes and tools. In fact, agile testing is built upon very
simple, strong and reasonable processes, like the process of conducting the daily
meeting or preparing the daily build. Similarly, agile testing attempts to leverage
tools, especially for test automation, as much as possible. Nevertheless, it needs to
be clearly understood that it is the testers who drive those tools and the output of
the tools depends on the testers (not the other way round).
Working software over comprehensive documentation: This means that
functional and usable software is valued over comprehensive but unusable
documentation. Though this is more directed to upfront requirement specifications
and design specifications, this can be true for test plans and test cases as well. Our
primary goal is the act of testing itself and not any elaborate documentation merely
pointing toward that goal. However, it is always best to have necessary
documentation in place so that the picture is clear and the picture remains with the
team if/ when a member leaves.
Customer collaboration over contract negotiation: This means that the client is
engaged frequently and closely in touch with the progress of the project (not through
complicated progress reports but through working pieces of software). This does put
some extra burden on the customer who has to collaborate with the team at regular
intervals (instead of just waiting till the end of the contract, hoping that deliveries will
be made as promised). But this frequent engagement ensures that the project is
heading in the right direction and not toward the building of a frog when a fish is
expected.
Responding to change over following a plan: This means accepting changes as
natural and responding to them without being afraid of them. It is always nice
to have a plan beforehand, but it is not very nice to stick to a plan at whatever
cost, even when the situation has changed. Let's say you write a test case, which is
your plan, assuming a certain requirement. Now, if the requirement changes, you do
not lament the waste of your time and effort. Instead, you promptly adjust
your test case to validate the changed requirement. And, of course, only a FOOL
would try to run the same old test case on the new software and mark the test as
FAIL.
PRINCIPLES BEHIND AGILE MANIFESTO
Behind the Agile Manifesto are the following principles which some agile practitioners
unfortunately fail to understand or implement. We urge you to go through each principle and
digest them thoroughly if you intend to embrace Agile Testing. On the right column, the
original principles have been re-written specifically for software testers.
We follow these principles:
A test case can include the following fields:
Test Case ID
Test Case Summary
Related Requirement
Prerequisites
Test Procedure
Test Data: The test data, or links to the test data, that are to be used while
conducting the test.
Expected Result
Actual Result: The actual result of the test; to be filled in after executing the test.
Status
Remarks
Created By
Date of Execution
Test Environment
Sample test case:
Test Suite ID: TS001
Test Case ID: TC001
Test Case Summary:
Related Requirement: RS001
Prerequisites:
1. User is authorized.
2. Coin balance is available.
Test Procedure:
1. Select the coin denomination in the Denomination field.
Test Data:
Expected Result:
Actual Result:
Status: Fail
Remarks:
Created By: John Doe
Executed By: Jane Roe
Date of Execution: 02/16/2020
Test Environment:
OS: Windows Y
Browser: Chrome N
WRITING GOOD TEST CASES
As far as possible, write test cases in such a way that you test only one thing at a
time. Do not overlap or complicate test cases. Attempt to make your test cases
atomic.
Ensure that all positive scenarios and negative scenarios are covered.
Language:
o Write in simple and easy to understand language.
o Use active voice: Do this, do that.
o Use exact and consistent names (of forms, fields, etc).
Characteristics of a good test case:
o Accurate: Exacts the purpose.
o Economical: No unnecessary steps or words.
o Traceable: Capable of being traced to requirements.
o Repeatable: Can be used to perform the test over and over.
o Reusable: Can be reused if necessary.
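The "atomic" guideline above, together with the positive/negative coverage rule, can be illustrated with a small sketch. The authenticate() helper and its credentials are hypothetical stand-ins for a real system under test; each test verifies exactly one thing.

```python
import unittest

def authenticate(user, password):
    # Stand-in for the real system under test (names are illustrative).
    return user == "jdoe" and password == "s3cret"

class TestLogin(unittest.TestCase):
    def test_valid_credentials_are_accepted(self):
        # Positive scenario: one check, one purpose.
        self.assertTrue(authenticate("jdoe", "s3cret"))

    def test_wrong_password_is_rejected(self):
        # Negative scenario: also atomic, not mixed into the test above.
        self.assertFalse(authenticate("jdoe", "wrong"))
```

Keeping each test to a single purpose makes failures easy to trace back to one requirement.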
Test scripts can be written in scripting/programming languages such as:
JavaScript
Perl
Python
Ruby
Tcl
VBScript
There are also many Test Automation Tools/Frameworks that generate the test scripts
for you, without the need for actual coding. Many of these tools have their own
scripting languages (some of them based on core scripting languages). For example,
Sikuli, a GUI automation tool, uses Sikuli Script, which is based on Python. A test script
can be as simple as the one below:
def sample_test_script(self):
    type("TextA")
    click(ImageButtonA)
    assertExist(ImageResultA)
A test execution engine and a repository of test scripts (along with test data) are
collectively known as a Test Harness.
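A minimal sketch of that idea, with plain functions serving as the repository of test scripts and a tiny execution engine that collects PASS/FAIL results (the names are illustrative, not a real tool):

```python
def test_addition():
    assert 2 + 2 == 4

def test_upper():
    assert "ok".upper() == "OK"

# The repository of test scripts.
TEST_REPOSITORY = [test_addition, test_upper]

def run_all(tests):
    # The execution engine: run every script and record its outcome.
    results = {}
    for test in tests:
        try:
            test()
            results[test.__name__] = "PASS"
        except AssertionError:
            results[test.__name__] = "FAIL"
    return results
```

Real harnesses (e.g. unittest or pytest runners) add discovery, fixtures, and reporting on top of this same loop.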
A defect report can include the following fields:
Project: Project name.
Product: Product name.
Release Version
Module
Detected Build Version: Build version of the product where the defect was detected (e.g.
1.2.3.5).
Summary
Description
Steps to Replicate: Step-by-step description of the way to replicate the defect. Number
the steps.
Actual Result: The actual result you received when you followed the steps.
Expected Results
Attachments
Remarks
Defect Severity
Defect Priority
Reported By
Assigned To
Status
Fixed Build Version: Build version of the product where the defect was fixed (e.g. 1.2.3.9).
REPORTING DEFECTS EFFECTIVELY
It is essential that you report defects effectively so that time and effort are not unnecessarily
wasted in trying to understand and reproduce the defect. Here are some guidelines:
Be specific:
o Specify the exact action: Do not say something like 'Select ButtonB'. Do you
mean 'Click ButtonB', or 'Press ALT+B', or 'Focus on ButtonB and press
ENTER'? Of course, if the defect can be arrived at by using all three
ways, it is okay to use a generic term like 'Select', but bear in mind that you
might just get the fix for the 'Click ButtonB' scenario. [Note: This might be a
highly unlikely example but it is hoped that the message is clear.]
o In case of multiple paths, mention the exact path you followed: Do not say
something like 'If you do A and X, or B and Y, or C and Z, you get D.'
Understanding all the paths at once will be difficult. Instead, say 'Do A and X
and you get D.' You can, of course, mention elsewhere in the report that D
can also be obtained if you do B and Y, or C and Z.
o Do not use vague pronouns: Do not say something like 'In ApplicationA, open
X, Y, and Z, and then close it.' What does the 'it' stand for? Z, or Y, or X, or
ApplicationA?
Be detailed:
o Provide more information (not less). In other words, do not be lazy.
Developers may or may not use all the information you provide but they sure
do not want to beg you for any information you have missed.
Be objective:
o Do not make subjective statements like 'This is a lousy application' or 'You
fixed it real bad.'
o Stick to the facts and avoid the emotions.
Reproduce the defect:
o Do not be impatient and file a defect report as soon as you uncover a defect.
Replicate it at least once more to be sure. (If you cannot replicate it again, try
recalling the exact test condition and keep trying. However, if you cannot
replicate it even after many trials, finally submit the report for further
investigation, stating that you are unable to reproduce the defect anymore
and providing any evidence of the defect you had gathered.)
Review the report:
o Do not hit Submit as soon as you write the report. Review it at least once.
Remove any typos.
Software Defect / Bug: Definition, Explanation, Classification, Details:
DEFINITION
A Software Defect / Bug is a condition in a software product which does not meet a software
requirement (as stated in the requirement specifications) or end-user expectations (which
may not be specified but are reasonable). In other words, a defect is an error in coding or
logic that causes a program to malfunction or to produce incorrect/unexpected results.
A program that contains a large number of bugs is said to be buggy.
Reports detailing bugs in software are known as bug reports. (See Defect Report)
Software Testing proves that defects exist but NOT that defects do not exist.
CLASSIFICATION
Software Defects/ Bugs are normally classified as per:
Severity / Impact (See Defect Severity)
Probability / Visibility (See Defect Probability)
Priority / Urgency (See Defect Priority)
Related Dimension of Quality (See Dimensions of Quality)
Related Module / Component
Phase Detected
Phase Injected
Related Module /Component
Related Module / Component indicates the module or component of the software where the
defect was detected. This provides information on which module / component is buggy or
risky.
Module/Component A
Module/Component B
Module/Component C
Phase Detected
Phase Detected indicates the phase in the software development lifecycle where the defect
was identified.
Unit Testing
Integration Testing
System Testing
Acceptance Testing
Phase Injected
Phase Injected indicates the phase in the software development lifecycle where the bug
was introduced. Phase Injected is always earlier in the software development lifecycle than
the Phase Detected. Phase Injected can be known only after a proper root-cause analysis
of the bug.
Requirements Development
High Level Design
Detailed Design
Coding
Build/Deployment
Note that the categorizations above are just guidelines and it is up to the
project/organization to decide on what kind of categorization to use. In most cases, the
categorization depends on the defect tracking tool that is being used. It is essential that
project members agree beforehand on the categorization (and the meaning of each
categorization) so as to avoid arguments, conflicts, and unhealthy bickering later.
NOTE: We prefer the term Defect over the term Bug because Defect is more
comprehensive.
The Differences Between Black Box Testing and White Box Testing are listed below.
Definition: Black Box Testing is testing without reference to the internal structure of the
component or system; White Box Testing is testing with knowledge of the internal structure.
Levels Applicable To: Black Box Testing is mainly applicable to the higher levels (System
Testing, Acceptance Testing); White Box Testing is mainly applicable to the lower levels
(Unit Testing, Integration Testing).
Responsibility: Black Box Testing is generally done by independent software testers; White
Box Testing is generally done by developers.
Programming Knowledge: Not required for Black Box Testing; required for White Box
Testing.
Implementation Knowledge: Not required for Black Box Testing; required for White Box
Testing.
Basis for Test Cases: Black Box tests are based on Requirement Specifications; White Box
tests are based on the Detail Design.
For a combination of the two testing methods, see Gray Box Testing.
Note: Job Descriptions might vary significantly from project to project or company to
company. And, mostly, one is required to do much more than what is stated in the Job
Description.
Designation
Senior Software Tester
Department
Quality Control
Line Manager
Software Test Manager
Job Aim
To supervise Software Testers and perform tests to ensure quality
Responsibilities
Supervision of Software Testers
Review of software requirements
Preparation/review of test plans
Preparation/review of test cases
Execution of tests
Reporting of defects
Preparation of test reports
Essential Skills
Proficiency in written and spoken English
Organized
Detailed
Skeptical
Advanced knowledge of computers
Teamwork
Desired Skills
Designation
Software Tester
Department
Quality Control
Line Manager
Senior Software Tester
Job Aim
To perform software tests to ensure quality
Responsibilities
Review of software requirements
Preparation of test cases
Execution of tests
Reporting of defects
25. User should be able to select only one radio option and any combination of
check boxes.
Test Scenarios for Filter Criteria
1. User should be able to filter results using all parameters on the page
2. Refine search functionality should load search page with all user selected search
parameters
3. When at least one filter criterion is required to perform the search operation,
make sure a proper error message is displayed when the user submits the page
without selecting any filter criteria.
4. When selection of a filter criterion is not compulsory, the user should be able to
submit the page and the default search criteria should be used to query results.
5. Proper validation messages should be displayed for invalid filter criteria values.
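Scenario 3 in the list above can be sketched as a server-side check. The function name and the error message text are assumptions for illustration:

```python
def validate_filters(filters, at_least_one_required=True):
    # Treat None, empty strings, and empty lists as "not selected".
    selected = {k: v for k, v in filters.items() if v not in (None, "", [])}
    if at_least_one_required and not selected:
        # Scenario 3: no criteria selected -> show a proper error message.
        return "Please select at least one filter criterion."
    return None  # no error; proceed with the search
```

The page would display the returned message, or run the search when None is returned.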
Test Scenarios for Result Grid
1. A page-loading symbol should be displayed when it is taking more than the
default time to load the result page
2. Check if all search parameters are used to fetch data shown on result grid
3. Total number of results should be displayed on result grid
4. Search criteria used for searching should be displayed on result grid
5. Result grid values should be sorted by default column.
6. Sorted columns should be displayed with sorting icon
7. Result grids should include all specified columns with correct values
8. Ascending and descending sorting functionality should work for columns
supported with data sorting
9. Result grids should be displayed with proper column and row spacing
10. Pagination should be enabled when there are more results than the default
result count per page
11. Check for Next, Previous, First and Last page pagination functionality
12. Duplicate records should not be displayed in result grid
13. Check if all columns are visible and horizontal scroll bar is enabled if necessary
14. Check data for dynamic columns (columns whose values are calculated
dynamically based on the other column values)
15. For result grids showing reports check Totals row and verify total for every
column
16. For result grids showing reports, check Totals row data when pagination is enabled
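Several of the grid scenarios above involve pagination. A sketch of the underlying arithmetic, assuming a hypothetical default page size of 25:

```python
import math

PAGE_SIZE = 25  # assumed default result count per page

def total_pages(result_count, page_size=PAGE_SIZE):
    # At least one page, even for an empty result set.
    return max(1, math.ceil(result_count / page_size))

def pagination_enabled(result_count, page_size=PAGE_SIZE):
    # Scenario 10: enable pagination only when results exceed one page.
    return result_count > page_size
```

Next/Previous/First/Last controls (scenario 11) would then be bounded by 1 and total_pages().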
Test Scenarios for Image Upload Functionality
11. Check if file selection dialog shows only supported files listed
12. Check multiple images upload functionality
13. Check image quality after upload. Image quality should not be changed after
upload
14. Check if user is able to use/view the uploaded images
Test Scenarios for Sending Emails
(Test cases for composing or validating emails are not included)
(Make sure to use dummy email addresses before executing email related tests)
1. Email template should use standard CSS for all emails
2. Email addresses should be validated before sending emails
3. Special characters in email body template should be handled properly
4. Language specific characters (e.g. Russian, Chinese or German language
characters) should be handled properly in email body template
5. Email subject should not be blank
6. Placeholder fields used in the email template should be replaced with actual
values, e.g. {Firstname} {Lastname} should be replaced with the individual's first
and last name properly for all recipients
7. If reports with dynamic values are included in email body, report data should be
calculated correctly
8. Email sender name should not be blank
9. Emails should be checked in different email clients like Outlook, Gmail, Hotmail,
Yahoo! mail etc.
10. Check send email functionality using TO, CC and BCC fields
11. Check plain text emails
12. Check HTML format emails
13. Check email header and footer for company logo, privacy policy and other links
14. Check emails with attachments
15. Check send email functionality to single, multiple or distribution list recipients
16. Check if the reply-to email address is correct
17. Check sending high volume of emails
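Scenario 6 above (placeholder replacement) can be sketched as follows; the template text and field names are illustrative:

```python
# A hypothetical email template with placeholder fields.
TEMPLATE = "Dear {Firstname} {Lastname}, your order has shipped."

def render(template, recipient):
    # Replace every {Placeholder} with the recipient's actual values;
    # a missing field raises KeyError rather than sending "{Firstname}".
    return template.format(**recipient)
```

A test would render the template for each recipient and verify no literal placeholder survives in the output.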
Test Scenarios for Excel Export Functionality
1. The file should get exported with the proper file extension
2. The file name of the exported Excel file should follow the standards, e.g. if the
file name uses a timestamp, it should get replaced properly with the actual
timestamp at the time of exporting the file
3. Check for date format if exported Excel file contains date columns
4. Check number formatting for numeric or currency values. Formatting should be
same as shown on page
5. Exported file should have columns with proper column names
6. Default page sorting should be carried in exported file as well
7. Excel file data should be formatted properly with header and footer text, date,
page numbers etc. values for all pages
8. Check if data displayed on page and exported Excel file is same
9. Check export functionality when pagination is enabled
10. Check if export button is showing proper icon according to exported file type
e.g. Excel file icon for xls files
11. Check export functionality for files with very large size
12. Check export functionality for pages containing special characters. Check if
these special characters are exported properly in Excel file
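Scenarios 1-2 above can be sketched as a file-naming helper; the naming pattern, prefix, and extension are assumptions:

```python
from datetime import datetime

def export_filename(prefix, ext="xlsx", now=None):
    # Scenario 2: the timestamp placeholder in the file name is replaced
    # with the actual time of export; scenario 1: proper extension.
    now = now or datetime.now()
    return f"{prefix}_{now.strftime('%Y%m%d_%H%M%S')}.{ext}"
```

A test would export a file and assert both the extension and that the embedded timestamp matches the export time.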
Performance Testing Test Scenarios
1. Check if page load time is within acceptable range
2. Check page load on slow connections
3. Check response time for any action under light, normal, moderate and heavy
load conditions
4. Check performance of database stored procedures and triggers
5. Check database query execution time
6. Check for load testing of application
7. Check for stress testing of application
8. Check CPU and memory usage under peak load condition
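Scenario 1 above (acceptable load time) can be sketched as a simple timing check; the 2-second budget and the measured action are assumptions for illustration:

```python
import time

ACCEPTABLE_SECONDS = 2.0  # assumed acceptable response-time budget

def measure(action):
    # Time a single action (e.g. a page request) in seconds.
    start = time.perf_counter()
    action()
    return time.perf_counter() - start

def within_budget(elapsed, budget=ACCEPTABLE_SECONDS):
    return elapsed <= budget
```

Real load/stress tests (scenarios 6-7) would run such measurements concurrently at increasing levels of load.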
Security Testing Test Scenarios
1. Check for SQL injection attacks
2. Secure pages should use HTTPS protocol
3. A page crash should not reveal application or server info; a generic error page
should be displayed instead
4. Escape special characters in input
5. Error messages should not reveal any sensitive information
6. All credentials should be transferred over an encrypted channel
7. Test password security and password policy enforcement
8. Check application logout functionality
9. Check for Brute Force Attacks
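Scenario 1 above (SQL injection) is commonly prevented with parameterized queries rather than string concatenation. A sketch using an in-memory SQLite table created here purely for illustration:

```python
import sqlite3

def find_user(conn, username):
    # The "?" placeholder lets the driver escape the input safely,
    # so a value like "x' OR '1'='1" cannot alter the query.
    cur = conn.execute("SELECT name FROM users WHERE name = ?", (username,))
    return cur.fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice'), ('bob')")
```

A security test would feed classic injection strings into every input and verify they return no extra rows and cause no errors.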
Software Testing
In 1983, the National Bureau of Standards of the United States published a set
of practices addressing the verification, validation, and testing of computer
programs. This methodology, aimed specifically at American federal institutions,
comprises methods of analysis, evaluation, and testing to be applied throughout
the application life cycle. The guide of good practices suggests choosing various
verification and validation methods, depending on the characteristics of each project,
with the goal of increasing the overall quality of the product. In the 1970s, the level of
professionalism of those involved in testing grew considerably. Dedicated positions
of tester, test manager, and test analyst appeared. Professional organizations of
those active in the field of software testing also appeared, along with specialized
publications, books, and articles. More importantly, the American institutions ANSI
and IEEE began developing standards to formalize the testing process, an effort that
materialized in standards such as ANSI/IEEE STD 829, in 1983, which established
certain formats to be used for creating test documentation.
of this exercise are that (1) the code does what it is supposed to do and (2) the code
does what it needs to do. The information obtained through the testing process
can be used to correct and improve the process by which the software product is
developed.[8]
Defecte i euri
Nu toate defectele software sunt cauzate de greeli n cod. O surs rspndit de
defecte costisitoare sunt lacunele i neclaritile la nivel de cerine, de exemplu,
cerinele "ascunse" sau incomplete pot s rezulte ntr-un set de erori introduse nc
n faza de proiectare de ctre designerul programului. [9] Cerinele nonfuncionale precum ar
fitestabilitatea, scalabilitatea, mentenabilitatea, usabilitatea, performana i securit
atea, sunt o surs raspndit de astfel de erori.
Software defects manifest themselves as the result of the following process: a programmer
commits an error (a mistake), which in turn results in a defect (a bug) in the program's
source code; if this defect is executed, then under certain conditions the system will produce
a wrong result, which subsequently leads to a failure of the program.[10] Not all defects lead
to program failure. For example, defects contained in a section of "dead" code will never
cause the program to fail. Defects can also manifest themselves as failures when the
environment in which the program runs changes. Examples of such changes include: moving to a
new hardware platform, alterations in the data source, or interaction with different
software.[10] A single defect can result in a wide spectrum of symptoms through which failures
manifest themselves.
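The error-defect-failure chain, and the observation that a defect in rarely executed code stays invisible, can be shown in a few lines. A hypothetical example, not from the text:

```python
def average(values, strict=False):
    """Return the arithmetic mean of a non-empty list of numbers."""
    if strict and not values:
        # Defect: the programmer wrote `return` instead of `raise`, so the
        # exception object is handed back as if it were a result. The bug
        # sits in the source code from day one...
        return ValueError("average() of empty input")
    return sum(values) / len(values)

# ...yet no failure is observed, because the faulty branch is never executed:
assert average([1, 2, 3]) == 2.0
assert average([10, 20]) == 15.0

# Only when the operating conditions change (strict mode plus empty input)
# does the defect manifest: a ValueError object is returned, not raised.
result = average([], strict=True)
print(type(result).__name__)   # prints "ValueError"
```

This mirrors the text: the defect exists the whole time, but the failure appears only under the particular conditions that execute the defective path.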
Compatibility
Software applications often fail because of compatibility problems caused both by their
interaction with other applications or operating systems and by the non-conformities that
appear from one version of the program to the next in an incremental development process.
Incompatibilities between versions arise because, at the time the code was written, the
programmer considered, or tested, the product on only a single operating system (or a
restricted set of operating systems), without taking into account the problems that can appear
when the execution context changes. An undesirable consequence is that the latest version of
the program may no longer be compatible with the software/hardware combination used earlier,
or may no longer be compatible with another system with which compatibility is extremely
important. Compatibility testing is therefore a "prevention-oriented strategy", which clearly
associates it with the last of the testing phases proposed by Dave Gelperin and
William C. Hetzel [6].
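One defensive practice that follows from this is to make the program's environment assumptions explicit, so that a port to a new platform fails loudly instead of misbehaving. A minimal sketch; the specific checks below are illustrative assumptions, not requirements from the text:

```python
import struct
import sys

def check_environment():
    """Return a list of mismatches between this runtime and the tested one."""
    problems = []
    if sys.version_info < (3, 8):
        problems.append("requires Python 3.8+ (the versions the suite ran on)")
    if struct.calcsize("P") * 8 != 64:
        problems.append("tested only on 64-bit interpreters")
    if sys.byteorder != "little":
        problems.append("binary data handling assumes little-endian byte order")
    return problems

issues = check_environment()
if issues:
    print("Compatibility warnings:", "; ".join(issues))
else:
    print("Environment matches the tested configuration.")
```

Such a check does not replace compatibility testing on each target platform, but it turns silent assumptions into visible, reviewable code.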
Combinations of Input Data and Preconditions
A fundamental problem of software testing has always been the impossibility of testing a
product using all possible combinations of input data and preconditions (initial states).
This holds true even for simple products.
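The impossibility of exhaustive testing can be made concrete with a little arithmetic. Even a trivial function taking two 32-bit integer arguments has an input space that no amount of hardware could cover:

```python
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

# A function of two 32-bit integer arguments:
combinations = (2 ** 32) ** 2          # 2^64 distinct input pairs

# Assume an optimistic one billion test executions per second:
years = combinations / 1_000_000_000 / SECONDS_PER_YEAR
print(f"{combinations:,} input combinations")
print(f"~{years:.0f} years to test them all")   # roughly 585 years
```

This is why practical testing relies on selection techniques (equivalence partitioning, boundary values, risk-based choices) rather than exhaustion.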
What are 5 common problems in the software development process?
unrealistic schedule - if too much work is crammed in too little time, problems
are inevitable.
inadequate testing - no one may know whether or not the software is any
good until customers complain or systems crash.
What are 5 common solutions to software development problems?
solid requirements/user stories - clear, complete, appropriately detailed, cohesive, attainable, testable
specifications that are agreed to by all players. In 'agile'-type environments, continuous close
coordination with product owners or their representatives is necessary to ensure that
changing/emerging requirements are understood.
realistic schedules - allow adequate time for planning, design, testing, bug fixing, re-testing, changes,
and documentation; personnel should be able to complete the project without burning out, and be
able to work at a sustainable pace.
adequate testing - start testing early on, re-test after fixes or changes, plan for adequate time for
testing and bug-fixing. 'Early' testing could include static code analysis/testing, test-first development,
unit testing by developers, built-in testing and diagnostic capabilities, etc. Automated testing can
contribute significantly if effectively designed and implemented as part of an overall testing strategy.
stick to initial requirements where feasible - be prepared to defend against excessive changes and
additions once development has begun, and be prepared to explain consequences. If changes are
necessary, they should be adequately reflected in related schedule changes. If possible, work closely
with customers/end-users to manage expectations. In agile environments, requirements may change
often, requiring that true agile processes be in place and followed.
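The test-first development mentioned above can be sketched with Python's built-in `unittest` module. The `slugify` function below is a hypothetical example used only to show the workflow:

```python
import unittest

# Test-first development: the tests are written before the implementation,
# so they initially fail, which proves they can actually detect a missing
# or broken feature.
class SlugifyTest(unittest.TestCase):
    def test_spaces_become_hyphens(self):
        self.assertEqual(slugify("Software Testing 101"), "software-testing-101")

    def test_punctuation_is_stripped(self):
        self.assertEqual(slugify("bugs, bugs!"), "bugs-bugs")

# The minimal implementation is written afterwards, to make the tests pass:
def slugify(title):
    """Turn a title into a lower-case, hyphen-separated URL fragment."""
    words = "".join(c if c.isalnum() else " " for c in title).split()
    return "-".join(word.lower() for word in words)

# Run the suite without exiting the interpreter:
unittest.main(argv=["slugify-demo"], exit=False, verbosity=0)
```

The same suite is then re-run after every fix or change, which is exactly the cheap, repeatable re-testing the list above calls for.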
use descriptive function and method names - use both upper and lower case,
avoid abbreviations, use as many characters as necessary to be adequately
descriptive (use of more than 20 characters is not out of line); be consistent
in naming conventions.
use descriptive variable names - use both upper and lower case, avoid
abbreviations, use as many characters as necessary to be adequately
descriptive (use of more than 20 characters is not out of line); be consistent
in naming conventions.
function and method sizes should be minimized; less than 100 lines of code is
good, less than 50 lines is preferable.
in adding comments, err on the side of too many rather than too few
comments; a common rule of thumb is that there should be at least as many
lines of comments (including header blocks) as lines of code.
make extensive use of error handling procedures and status and error
logging.
for C++, keep class methods small, less than 50 lines of code per method is
preferable.
the program should act in a way that least surprises the user
it should always be evident to the user what can be done next and how to exit
the program shouldn't let the users do something stupid without warning them.
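Several of the coding recommendations above (descriptive names, small single-purpose functions, explicit error handling with logging) can be combined in one short sketch. The payroll function is a hypothetical example:

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("payroll")

# A descriptive name longer than 20 characters, a function well under
# 50 lines, and error handling that logs rather than failing silently.
def calculate_monthly_gross_pay(hourly_rate, hours_worked):
    """Return gross pay for one month, or None if the inputs are invalid."""
    if hourly_rate < 0 or hours_worked < 0:
        logger.error("invalid input: rate=%s hours=%s", hourly_rate, hours_worked)
        return None
    return hourly_rate * hours_worked

print(calculate_monthly_gross_pay(25.0, 160))   # 4000.0
print(calculate_monthly_gross_pay(25.0, -5))    # logs an error, prints None
```

Returning `None` versus raising an exception is a design choice; what matters for the guidelines above is that the invalid case is detected, logged, and handled deliberately.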
SEI = 'Software Engineering Institute' at Carnegie-Mellon University; initiated by the U.S. Defense
Department to help improve software development processes.
CMM = 'Capability Maturity Model', now called the CMMI ('Capability Maturity Model Integration'),
developed by the SEI and as of January 2013 overseen by the CMMI Institute at Carnegie Mellon
University. In the 'staged' version, it's a model of 5 levels of process 'maturity' that help determine
effectiveness in delivering quality software. CMMI models are "collections of best practices that help
organizations to improve their processes." It is geared to larger organizations such as large U.S.
Defense Department contractors. However, many of the QA processes involved are appropriate to any
organization, and if reasonably applied can be helpful. Organizations can receive CMMI ratings by
undergoing assessments by qualified auditors. CMMI V1.3 (2010) also supports Agile development
processes.
ISO = 'International Organization for Standardization' - The ISO 9001:2015 standard (the latest in the
periodically-updated ISO standard) concerns quality systems that are assessed by outside auditors,
and it applies to many kinds of production and manufacturing organizations, not just software. It covers
documentation, design, development, production, testing, installation, servicing, and other processes.
The full set of standards consists of: (a) ISO 9001:2015 - Quality Management Systems:
Requirements; (b) ISO 9000:2015 - Quality Management Systems: Fundamentals and Vocabulary;
(c) ISO 9004:2009 - Quality Management Systems: Guidelines for Performance Improvements;
(d) ISO 19011:2011 - Guidelines for auditing management systems. To be ISO 9001 certified, a third-party
auditor assesses an organization, and certification is typically good for about 3 years, after which a
complete reassessment is required. Note that ISO certification does not necessarily indicate quality
products - it indicates only that documented processes are followed. There are also other software-related ISO standards such as ISO/IEC 25010:2011, which includes a 'quality in use model' composed
of five characteristics and a 'product quality model' that covers eight main characteristics of software.
Also see http://www.iso.org/ for the latest information. In the U.S. the standards can also be purchased
via the ASQ web site at http://asq.org/quality-press/
ISO/IEC 25010 is a software quality evaluation standard that defines (a) a 'quality in use model' of five
characteristics that relate to the outcome of interaction when a product is used in a particular context of
use, and (b) a 'product quality model' composed of eight characteristics that relate to static properties
of software and dynamic properties of the computer system.
ISO/IEC/IEEE 29119 series of standards for software testing.
ISO/IEC/IEEE 29119-1: Concepts & Definitions (published Sept. 2013)
ISO/IEC/IEEE 29119-2: Test Processes (published Sept. 2013)
ISO/IEC/IEEE 29119-3: Test Documentation (published Sept. 2013)
ISO/IEC/IEEE 29119-4: Test Techniques (published 2015)
ISO/IEC/IEEE 29119-5: Keyword-Driven Testing (published 2016)
IEEE = 'Institute of Electrical and Electronics Engineers' - among other things, creates standards such
as 'IEEE Standard for Software Test Documentation' (IEEE/ANSI Standard 829), 'IEEE Standard for
Software Unit Testing' (IEEE/ANSI Standard 1008), 'IEEE Standard for Software Quality Assurance
Plans' (IEEE/ANSI Standard 730), and others.
ANSI = 'American National Standards Institute', the primary industrial standards body in the U.S.;
publishes some software-related standards in conjunction with the IEEE and ASQ (American Society
for Quality).
SOFTWARE TESTING JOKES: The following jokes related to software testing have
been compiled from forwarded emails and internet resources. Thanks to the ones
who thought of them first.
struggling to get the correct measurement; dropping the tape measures and falling
off the ladders.
A tester comes along and sees what they're trying to do, walks over, pulls down the
flagpole, lays it flat, measures it from end to end, gives the measurement to one of
the managers and walks away.
After the tester is gone, one manager turns to another and laughs, "Isn't that just
like a tester? We're looking for the height and he gives us the length."
Damage Testing
The Aviation Department had a unique device for testing the strength of windshields
on airplanes. The device was a gun that launched a dead chicken at a plane's
windshield at approximately the speed the plane flies. The theory was that if the
windshield does not crack from the impact of the chicken, it will survive a real
collision with a bird during flight.
The Railroad Department heard of this device and decided to use it for testing a
windshield on a locomotive they were developing.
So the Railroad Department borrowed the device, loaded a chicken and fired at the
windshield of the locomotive. The chicken not only shattered the windshield but also
went right through and made a hole in the back wall of the engine cab, the
unscathed chicken's head popping out of the hole. The Railroad Department was
stunned and contacted the Aviation Department to recheck the test to see if
everything was done correctly.
The Aviation Department reviewed the test thoroughly and sent a report. The report
consisted of just one recommendation, and it read: "Use a thawed chicken."
A Tester's Courage
The Director of a software company proudly announced that flight software
developed by the company had been installed in an airplane and that the airline was
offering free first flights to the members of the company. "Who is interested?" the
Director asked. Nobody came forward. Finally, one person volunteered. The brave
Software Tester stated, "I will do it. I know that the airplane will not be able to take off."
Light Bulb
Question: How many testers does it take to change a light bulb?
Answer: None. Testers do not fix problems; they just find them.
Question: How many programmers does it take to change a light bulb?
Answer 1: "What's the problem? The bulb at my desk works fine!"
Answer 2: None. That's a hardware problem.
The Glass
Testing Definition
To tell somebody that he is wrong is called criticism. To do so officially is called
testing.
Words
Developer: "There is no I in TEAM."
Tester: "We cannot spell BUGS without U."
Experience Counts
There was a software tester who had an exceptional gift for finding bugs. After
serving his company for many years, he happily retired. Several years later, the
company contacted him regarding a bug in a multi-million-dollar application which
no one in the company was able to reproduce. They tried for many days to replicate
the bug but without success.
In desperation, they called on the retired software tester and after much persuasion
he reluctantly took the challenge.
He came to the company and started studying the application. Within an hour, he
provided the exact steps to reproduce the problem and left. The bug was then fixed.
Later, the company received a bill for $50,000 from the software tester for his
service. The company was stunned with the exorbitant bill for such a short duration
of service and demanded an itemized accounting of his charges.
The software tester responded with the itemization:
Bug Report: $1
Sandwich
Two software testers went into a diner and ordered two drinks. Then they produced
sandwiches from their briefcases and started to eat. The owner became quite
concerned and marched over and told them, "You cannot eat your own sandwiches
in here!"
The testers looked at each other, shrugged their shoulders and then exchanged
sandwiches.
Your love letters get returned to you marked up with red ink, highlighting your
grammar and spelling mistakes.
When you tell him that you won't change something he has asked you to
change, he'll offer to allow you two other flaws in exchange for correcting this
one.
When you ask him how you look in a dress, he'll actually tell you.
When you give him the "It's not you, it's me" breakup line, he'll agree with
you and give the specifics.
He won't help you change a broken light bulb because his job is simply to
report and not to fix.
He'll keep bringing up old problems that you've since resolved just to make
sure that they're truly gone.
Who Is Who
A Project Manager is the one who thinks 9 women can deliver a baby in 1
month.
An Onsite Coordinator is the one who thinks 1 woman can deliver 9 babies in
1 month.
A Developer is the one who thinks it will take 18 months to deliver 1 baby.
A Marketing Manager is the one who thinks he can deliver a baby even if no
man and woman are available.
A Tester is the one who always tells his wife that this is not the right baby.
Programmer Responses
Some sample replies that you get from programmers when their programs do not
work:
"It worked yesterday."
Assessment Of An Opera
A CEO of a software company was given a ticket for an opera. Since he was unable
to go, he passed the invitation to the company's Quality Assurance Manager.
The next morning, the CEO asked him how he enjoyed it, and he was handed a
report, which read as follows:
For a considerable period, the oboe players had nothing to do. Their number should
be reduced, and their work spread over the whole orchestra, thus avoiding peaks of
inactivity. All twelve violins were playing identical notes. This seems unnecessary
duplication, and the staff of this section should be drastically cut. If a large volume
of sound is really required, this could be obtained through the use of an amplifier.
Much effort was involved in playing the demi-semiquavers. This seems an excessive
refinement, and it is recommended that all notes be rounded up to the nearest
semiquaver. No useful purpose is served by repeating with horns the passage that
has already been handled by the strings. If all such redundant passages were
eliminated, the concert could be reduced from two hours to twenty minutes.
The Search
Under a streetlight, on a very dark night, a software tester was looking for a set of
lost keys.
A policeman came by, asked him about the object of his search, and joined him to
help. After the two had searched for some time, the policeman asked, "Are you sure
you lost them here?"
"Oh, no," said the software tester. "I lost the keys somewhere else."
"Then why are you looking for them over here?" the policeman asked.
"Because this is where the light is!" the software tester replied.
Moral: Do not be so stupid that you search for bugs only at the obvious places.
Disney Password
A person with a developer background was hired as a software tester and assigned
to a Disney website project. On reviewing his test data for the login feature, it was
found that he had "MickeyDonaldGoofyPluto" for the password field. Amused, his
manager asked him why.
"It says the password needs to have at least four characters," he replied.
Food Testing
EGGS: When something starts pecking its way out of the shell, the egg is
probably past its prime.
DAIRY PRODUCTS: Milk is spoiled when it starts to look like yogurt. Yogurt is
spoiled when it starts to look like cottage cheese. Cottage cheese is spoiled
when it starts to look like regular cheese. Regular cheese is nothing but
spoiled milk anyway and can't get any more spoiled than it is already.
MEAT: If opening the refrigerator door causes stray animals from a three-block radius to congregate outside your house, the meat is spoiled.
BREAD: Fuzzy and hairy looking white or green growth areas are a good
indication that your bread has turned into a pharmaceutical laboratory
experiment.
CANNED GOODS: Any canned goods that have become the size or shape of a
softball should be disposed of.
GENERAL RULE OF THUMB: Most food cannot be kept longer than the average
life span of a hamster. Keep a hamster in or nearby your refrigerator to gauge
this.
Tickle Me Toys
There is a factory that makes Tickle Me toys. The toy laughs when you tickle it
under the arms.
Jane is hired at the factory and she reports for her first day promptly. The next day
there is a knock at the Personnel Manager's door. The Foreman throws open the
door and begins to rant about the new employee. He complains that she is
incredibly slow and the whole line is delayed, putting the entire production line
behind schedule.
The Personnel Manager decides he should see this for himself, so the two men
march down to the factory floor. When they get there the line is so backed up that
there are Tickle Me toys all piling up. At the end of the line stands a nervous Jane
surrounded by mountains of Tickle Me toys.
She has a roll of thread and a huge bag of small marbles. The two men watch in
amazement as she cuts a little piece of thread, wraps it around two marbles and
begins to carefully sew the little package between the toy's legs.
The Personnel Manager bursts into laughter. After several minutes of hysterics he
pulls himself together, approaches Jane and says, "I'm sorry, Jane, but I think you
misunderstood the instructions I gave you yesterday. Your job was to give the toys
two test tickles each."
Developers provide the last build with the fixes to 2 critical bugs.
Testers run a smoke test and find that a major feature is missing. Normally,
the build is not accepted if the smoke test fails, but they continue.
It is found that one of the critical bugs is not fixed. Instead, 3 more minor bugs
have been introduced. Luckily, the other critical bug is verified as fixed.
07:00PM
Developers argue that the 3 minor bugs are not bugs but enhancement
requests and that the missing major feature will not be noticed by the end-users.
Project Manager says that they will be mentioned as known issues but the
critical bug needs to be fixed anyhow.
10:00PM
Developers provide the "really last" build with the fix for the critical bug and
go home.
Testers have no time to run smoke test or regression tests. They just verify
the fix for the critical bug.
11:00PM
11:58PM
Next Day
Guess what!
8:00 AM: Reach office; Smile at everyone; Finish morning office chores
11:00 AM: Test; Argue with them; Randomly click the mouse button multiple
times
4:00 PM: Receive the so-called-final build; Curse for the delay; Conduct Sulk
Test
5:00 PM: Test; Protest against the decision to make a release despite such
poor quality
There are many other specialized positions in Manual Testing, Automated
Testing, and Test Management.
How much will I earn?
Yes, money does matter and this field does pay well (not spectacularly well, but well
enough). The following sites will give you a rough idea. However, remember
that money is secondary to self-satisfaction.
ITJobsWatch.co.uk
SoftwareTestingInstitute.com
Indeed.com
Payscale.com
HotJobs.Yahoo.com
QAJobs.net
SoftwareTestingJobs.net
SoftwareTestingJobs.com
TestingJobz.com
Indeed.com
JobCentral.com
Monster.com
HotJobs.Yahoo.com
Jobs.com
ITJobs.com
ITJobs.net
ComputerJobs.com
If the management of the company is not very much concerned about quality
or does not know much about it, then you and your team might have a tough
time.
If you and your team will have to rely on developers for everything, then
there is something wrong. Instead, the developers should be relying on you
for product knowledge, requirements understanding etc.
If the company does not have some sort of quality assurance in place, your
efforts on quality control will be either worthless or extremely painful.
If you or the company thinks that software testing is for the non-performers,
then all of you are doomed.
If you have poor communication skills (you will need to do a lot more
writing and arguing than you can imagine)
If you do not like minute work (you will have to investigate things through a
magnifying glass while keeping a bird's-eye view at the same time)
If you are impatient and wish to see results every minute or so (you will have
to do a lot of waiting and cursing in this field)
If you did not read this article word by word (If this sentence caught your eye
while you were skimming, then your satisfaction score in a software testing
job will be just 2.5 out of 100)
Purpose: The purpose of your CV is not to get you the job but an interview.
When competition is high, the quality of your CV determines your destiny.
Perspective: You are not writing a CV for yourself but for the reader. So, write
what the intended reader needs and wishes to see in your CV.
Honesty: Do not claim something you do not possess. You might be offered an
interview and then the job as well but you will not survive for long. Moreover,
you will live and die guilty.
Achievements: Make them short statements and relate them to the job you
are applying for if you can and if true.
Employment Details: Begin with your current or most recent job and work
backwards.
Hobbies: No need to mention these, especially if you are into stamp collecting.
Length: About 2 pages is fine. People do not have the time and patience for
tomes these days.
Margins: Neither too wide nor too narrow. About 1 inch is fine.
Fonts: Do not use too many. Preferably, use a single font. Two at the max.
Font Type: Do not use fancy/uncommon fonts. Use common fonts such as
Arial or Times New Roman.
Text Size: Do not use too large or too small size. 10pt to 12pt is fine.
Color: Use a single color, preferably black. Or else, two colors at the max. Do
not use shades; they do not print well.
Let your CV shine and you are already a step closer to your software testing job.
Good Luck!