TEST AUTOMATION
Ganesh Pagade
Introduction
 Automation – testing that can be done programmatically
 Far more efficient than manual testing
 More complex than it appears
Testing and Automation are Different
[Diagram: test cases judged on four attributes – Effective, Exemplary, Economic, Evolvable – comparing a manual test, the first run of an automated test, and automated tests after many runs.]
Promises of Test Automation
 Run existing tests on a new version of a program
 Run more tests more often
 Perform tests which would be difficult or impossible to do manually
 Better use of resources
Promises of Test Automation…
 Consistency and repeatability of tests
 Reuse of tests
 Earlier time to market
 Increased confidence
Automation and Agile
[Diagram: a development timeline from the start of development through the ends of iterations 1, 2, and 3, showing the development team, QA testers, and the automation team. In each iteration, application code is created, tests are created, and tests are automated, with continuous integration running in every iteration.]
What to Automate?
[Diagram: the five test activities – Identify, Design, Build, Execute, Check. The intellectual activities (identifying and designing) govern the quality of the tests; the clerical activities (executing and checking) are the ones that are good to automate.]
Scripting
 Test Script – data and/or instructions with a formal syntax, used by a test execution automation tool, typically held in a file
 Writing scripts is much like writing a computer program
 Reduce the amount of scripting
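The two scripting techniques described in the notes for this slide – factoring a common action into one shared script, and using control structures to loop over test data instead of hand-copying steps – might look like this in Python. This is a minimal sketch with made-up names, not code from the deck:

```python
# Minimal sketch of two ways to reduce scripting effort:
# one shared action script reused by many test cases, and a control
# structure looping over test data instead of hand-copied steps.
# All names here are illustrative; the deck prescribes no tool or API.

def authenticate(users, name, password):
    """Stand-in for the application behaviour under test."""
    return users.get(name) == password

def login(users, name, password):
    """Shared action script: every login test case reuses this function."""
    return authenticate(users, name, password)

if __name__ == "__main__":
    users = {"alice": "a-secret", "bob": "b-secret"}
    cases = [("alice", "a-secret", True),   # valid credentials
             ("bob", "wrong", False),       # wrong password
             ("carol", "a-secret", False)]  # unknown user
    for name, password, expected in cases:  # one loop, not three scripts
        status = "PASS" if login(users, name, password) == expected else "FAIL"
        print(f"{status}: login({name!r})")
```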
Attributes of a Script Set
 Number of Scripts
 Size of Scripts
 Function
 Documentation
 Reuse
 Structured
 Maintenance
Automated Comparison
 Verification by comparison
 Dynamic comparison
 Post-execution comparison
 Integration of test execution and post-execution comparison
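To make the two comparison styles concrete, here is a small sketch (file names and functions are illustrative): a dynamic comparison checks the outcome while the test case runs, and a post-execution comparator diffs a saved actual-outcome file against a stored expected-outcome file afterwards.

```python
# Sketch of dynamic vs post-execution comparison; names are illustrative.
import difflib
from pathlib import Path

def run_test_case(compute):
    """Dynamic comparison: the outcome is checked while the test runs."""
    actual = compute(2, 3)
    assert actual == 5, f"dynamic comparison failed: got {actual}"
    # Save the outcome for post-execution comparison afterwards.
    Path("actual_output.txt").write_text(f"result={actual}\n")

def post_execution_compare(actual_file, expected_file):
    """Comparator run after the test: diff saved actual vs expected."""
    actual = Path(actual_file).read_text().splitlines()
    expected = Path(expected_file).read_text().splitlines()
    diff = list(difflib.unified_diff(expected, actual, lineterm=""))
    return len(diff) == 0, diff

if __name__ == "__main__":
    Path("expected_output.txt").write_text("result=5\n")
    run_test_case(lambda a, b: a + b)
    ok, _ = post_execution_compare("actual_output.txt", "expected_output.txt")
    print("post-execution comparison:", "PASS" if ok else "FAIL")
```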
Automated Comparison…
[Diagram (separate comparator): manual tasks – start the test tool, select and run the test cases; run the comparator(s); determine post-execution comparison success or failure. Tool tasks – run the test cases (including any dynamic comparisons), determine test case success or failure, and perform the comparisons.]
Automated Comparison…
[Diagram (integrated comparison): manual tasks – start the test tool, select and run the test cases; determine test case success or failure; determine post-execution comparison success or failure. Tool tasks – run the test cases, including any dynamic comparisons and the post-execution comparisons.]
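The gap described in the notes for these slides – having to look in two places for the verdict – can be narrowed by making the script run the post-execution comparison itself and report one combined result. A minimal sketch, with illustrative names:

```python
# Sketch of integrating execution and post-execution comparison:
# the script runs the comparator itself and reports one combined verdict,
# so there is a single place to look for the test case status.
from pathlib import Path
import filecmp

def run_and_verify(test_case, actual_file, expected_file):
    execution_ok = test_case()                    # dynamic checks inside
    comparison_ok = filecmp.cmp(actual_file, expected_file, shallow=False)
    verdict = "PASS" if (execution_ok and comparison_ok) else "FAIL"
    print(f"{test_case.__name__}: {verdict} "
          f"(execution={execution_ok}, comparison={comparison_ok})")
    return verdict

if __name__ == "__main__":
    Path("expected.txt").write_text("42\n")

    def sample_case():
        Path("actual.txt").write_text("42\n")   # the test's saved output
        return True                             # dynamic comparisons passed

    run_and_verify(sample_case, "actual.txt", "expected.txt")
```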
Testware Architecture
 Testware – all the artifacts required for testing
 Architecture – arrangement of all of these artifacts
 Test Sets – logical collection of testware artifacts
 Test Suite – collection of Test Sets to meet a given test objective
 Testware Library – a repository of the master versions of all Testware Sets
Testware Architecture…
[Diagram: a Test Suite (the baseline) is a collection of configuration items – a Script Set, a Test Set, a Data Set, and a Utility Set – whose testware artifacts are, respectively: scripts; inputs, documentation, and expected outcomes; data; and utilities.]
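One way this architecture might map onto a repository is sketched below. The directory names and the one-version-directory-per-set idea are assumptions for illustration, echoing the configuration-management point in the note for slide 15:

```python
from pathlib import Path

# Illustrative repository layout for one versioned Testware Set
# (directory and file names are assumptions, not prescribed by the deck).
LAYOUT = [
    "suite/v1.2/script_set/login.py",
    "suite/v1.2/test_set/inputs/login_cases.csv",
    "suite/v1.2/test_set/docs/login_cases.md",
    "suite/v1.2/test_set/expected/login_expected.txt",
    "suite/v1.2/data_set/users.db",
    "suite/v1.2/utility_set/comparator.py",
]

def create_testware_library(root):
    """Create the skeleton; the whole set shares one version number,
    matching the note that sets (not individual files) are versioned."""
    for item in LAYOUT:
        path = Path(root, item)
        path.parent.mkdir(parents=True, exist_ok=True)
        path.touch()

if __name__ == "__main__":
    create_testware_library("testware_library")
```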
Automating Pre & Post Processing
Manual process, with the pre- and post-processing steps called out (a driver-loop sketch follows this list):
Select/identify test cases to run
Set up test environment:
• Create test environment
• Load test data
Repeat for each test case:
• Set up test prerequisites
• Execute
• Compare results
• Log results
• Clear up after test case
Clean up test environment:
• Delete unwanted data
• Save important data
Summarize results
Analyze test failures
Report defects
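A minimal driver-loop sketch automating the process above; the TestCase shape and all names are illustrative stand-ins, not a real tool's API:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class TestCase:
    name: str
    execute: Callable[[], object]
    expected: object
    setup: Callable[[], None] = lambda: None     # test prerequisites
    cleanup: Callable[[], None] = lambda: None   # clear up after test case

def run_suite(cases, setup_env, teardown_env):
    setup_env()                                   # create env, load test data
    results = {}
    for case in cases:                            # repeat for each test case
        case.setup()
        passed = case.execute() == case.expected  # execute + compare results
        results[case.name] = passed               # log results
        case.cleanup()
    teardown_env()                                # delete/save data
    failed = [n for n, ok in results.items() if not ok]  # summarize results
    print(f"{len(cases) - len(failed)} passed, {len(failed)} failed: {failed}")
    return results

if __name__ == "__main__":
    cases = [TestCase("add", lambda: 2 + 3, 5),
             TestCase("upper", lambda: "qa".upper(), "QA")]
    run_suite(cases, setup_env=lambda: None, teardown_env=lambda: None)
```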
Automating Pre & Post Processing…
 Pre-processing tasks – Create, Check, Reorganize, Convert
 Post-processing tasks – Delete, Check, Reorganize, Convert
 Processing at different stages
 What should happen after test case execution?
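In a Python stack, pre- and post-processing at different stages maps naturally onto pytest fixtures of different scopes. The deck names no tool, so this is only an illustrative sketch of that mapping:

```python
# Sketch: pre-/post-processing at different stages with pytest fixtures.
# Session-scoped fixtures wrap the whole run; function-scoped fixtures
# wrap each test case. (pytest is assumed; the deck prescribes no tool.)
import pytest

@pytest.fixture(scope="session")
def test_environment():
    env = {"db": []}          # pre-processing: create environment, load data
    yield env
    env["db"].clear()         # post-processing: delete unwanted data

@pytest.fixture
def prerequisites(test_environment):
    test_environment["db"].append("seed-row")    # per-test pre-processing
    yield test_environment
    test_environment["db"].remove("seed-row")    # per-test clear-up

def test_seed_row_present(prerequisites):
    assert "seed-row" in prerequisites["db"]
```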
Limitations of Automation
 Does not replace manual testing
 Manual tests find more defects than automated tests
 Greater reliance on the quality of the tests
 Test automation does not improve effectiveness
 Test automation may limit software development
 Tools have no imagination
Career Opportunities
 Test Automation Architect – designs the overall structure of the automation
 Test Automator – responsible for designing, writing, and maintaining the automation software
 Bridge between the Tester and the Tool
 Good Programming Skills – SDET
 Scripting – Perl, Python, Shell, sed, AWK, etc.
 Debugging and Analysis
References
 Software Test Automation – Dorothy Graham and Mark Fewster
 Experiences of Test Automation – Dorothy Graham and Mark Fewster
 Presentations and white papers from cigital.com
Editor's Notes
  • #2: The objective of this presentation is to give you an introduction to Test Automation: its importance in the context of today's agile methodologies, what automation involves, and its complexity and limitations. If you are a student or at an early stage of your career, I intend to present a career option and generate curiosity, so that you will research it further based on the material listed at the end of the presentation.
  • #3: Done programmatically. Far more efficient: a mature test automation regime allows testing at the 'touch of a button', with tests run overnight when machines would otherwise be idle. Automated tests are repeatable, using exactly the same inputs in the same sequence time and again, something that cannot be guaranteed with manual testing. Automated testing enables even the smallest maintenance change to be fully tested with minimal effort. At first glance, it seems easy to automate testing: just buy one of the popular test execution tools, record the manual tests, and play them back whenever you want to. Unfortunately, as those who have tried it have discovered, it doesn't work like that in practice.
  • #4: Before we go into the details of automation, I would like to highlight that automation is different from testing. Testing is a skill, and it depends on quality test cases. A test case has four attributes. Effective – whether or not it finds defects, or at least whether or not it is likely to find defects. Exemplary – an exemplary test case tests more than one thing, thereby reducing the total number of test cases required. Economic – how economical a test case is to perform, analyze, and debug. Evolvable – how much maintenance effort is required on the test case each time the software changes. Automation is also a skill, of a different kind. Comparing manual and automated tests against these four attributes: whether a test is automated or performed manually affects neither its effectiveness nor how exemplary it is. It doesn't matter how clever you are at automating a test or how well you do it; if the test itself achieves nothing, then the end result is a test that achieves nothing faster. Once implemented, an automated test is generally much more economic, the cost of running it being a mere fraction of the effort to perform it manually. However, automated tests generally cost more to create and maintain. On the roles of the tester versus the test automator: the person who builds and maintains the artifacts associated with the use of a test execution tool is the test automator. A test automator may or may not also be a tester, and may or may not be a member of a test team. For example, there may be a test team consisting of user testers with business knowledge and no technical software development skills.
  • #5: Other than efficiency, let's quickly look at the other benefits of automation. Regression testing: in an environment where many programs are frequently modified, the effort involved in performing a set of regression tests should be minimal. A clear benefit of automation is the ability to run more tests in less time and therefore to make it possible to run them more often; this leads to greater confidence in the system. Attempting to perform a full-scale live test of an online system with, say, 200 users may be impossible, but the input from 200 users can be simulated using automated tests. Better use of resources: automating menial and boring tasks, such as repeatedly entering the same test inputs, gives greater accuracy as well as improved staff morale, and frees skilled testers to put more effort into designing better test cases to be run. Machines that would otherwise lie idle overnight or at the weekend can be used to run automated tests.
  • #6: Consistency and repeatability of tests: tests that are repeated automatically will be repeated exactly the same way every time. This gives a level of consistency to the tests which is very difficult to achieve manually. The same tests can be executed on different hardware configurations, using different operating systems, or using different databases. This gives a consistency of cross-platform quality for multi-platform products which is virtually impossible to achieve with manual testing. Reuse of tests: the effort put into deciding what to test, designing the tests, and building the tests can be distributed over many executions of those tests. Tests which will be reused are worth spending time on to make sure they are reliable. Once a set of tests has been automated, it can be repeated far more quickly than it would be manually, so the testing elapsed time can be shortened. Knowing that an extensive set of automated tests has run successfully, there can be greater confidence that there won't be any unpleasant surprises when the system is released (provided that the tests being run are good, effective tests!).
  • #7: As agile development becomes more prevalent, automation becomes more important. Continuous integration is test automation; regression tests are run every day, if not more often. The automation also needs to be responsive to change, just as agile development is, so the testware architecture is more critical. Test automation is successful in traditional as well as agile development, but agile development cannot succeed without test automation.
  • #8: Let's look at the test activities, because these are the activities that we may want to automate. Identify – determine 'what' can be tested; this could be done in parallel with the development activity. Design – determine 'how' to test; test case design will produce a number of tests comprising specific input values, expected outcomes, and any other information needed for the test to run, such as environment prerequisites. Build – implement the test scripts, test inputs, test data, and expected outcomes for comparison. Execute – execute the test cases. Check – compare test case outcomes to expected outcomes. As shown here, the first two test activities, identifying test conditions and designing test cases, are mainly intellectual in nature. The last two activities, executing test cases and comparing test outcomes, are more clerical in nature. It is the intellectual activities that govern the quality of the test cases. The clerical activities are particularly labor intensive and are therefore well worth automating. The activities of test execution and comparison are repeated many times, while the activities of identifying test conditions and designing test cases are performed only once (except for rework due to errors in those activities) – for example, if a test finds an error in the software, if a test fails for an environmental reason such as incorrect test data being used, or if tests are to be run on different platforms. It is in automating the latter test activities that there is most to gain.
  • #9: A test script is the data and/or instructions with a formal syntax, used by a test execution automation tool, typically held in a file. A test script can implement one or more test cases, navigation, set-up or clear-up procedures, or verification. Test scripts that you produce should be properly engineered; writing scripts is much like writing a computer program. Although test scripts cannot be done away with altogether, using different scripting techniques can reduce the size, number, and complexity of scripts. One of the benefits of editing and coding scripts is to reduce the amount of scripting necessary to automate a set of test cases. This is achieved in two ways. One way is to code relatively small pieces of script that each perform a specific action or task common to several test cases; each test case that needs to perform one of the common actions can then use the same script. The other way to reduce scripting is to insert control structures into the scripts to make the tool repeat sequences of instructions without having to code multiple copies of the instructions.
  • #10: Number of scripts – fewer (less than one script for each test case). Size of scripts – small, with annotation, no more than two pages. Function – each script has a clear, single purpose. Documentation – specific documentation for users and maintainers; clear, succinct, and up to date. Reuse – many scripts reused by different test cases. Structured – easy to see and understand the structure and therefore to make changes, following good programming practices and well-organized control constructs. Maintenance – easy to maintain; changes to the software only require minor changes to a few scripts.
  • #11: Test verification is the process of checking whether or not the software has produced the correct outcome. This is achieved by performing one or more comparisons between an actual outcome of a test and the expected outcome of that test (i.e. the outcome when the software is performing correctly). Some tests require only a single comparison to verify their outcome, while other tests may require several comparisons. For example, a test case that has entered new information into a database may require at least two comparisons: one to check that the information is displayed on the screen correctly, and the other to check that the information is written to the database successfully. When automating test cases, the expected outcomes have either to be prepared in advance or generated by capturing the actual outcomes of a test run. In the latter case the captured outcomes must be verified manually and saved as the expected outcomes for further runs of the automated tests; this is called reference testing. An automated comparison tool, normally referred to as a 'comparator,' is a computer program that detects differences between two sets of data. For test automation this data is usually the outcome of a test run and the expected outcome. Dynamic comparison is the comparison performed while a test case is executing. Test execution tools normally include comparator features specifically designed for dynamic comparison. Dynamic comparison is perhaps the most popular because it is much better supported by commercial test execution tools, particularly those with capture/replay facilities. It is best used to check things as they appear on the screen, in much the same way as a human tester would. Dynamic comparison can also be used to help program some intelligence into a test case, to make it act differently depending on the output as it occurs. For example, if an unexpected output occurs it may suggest that the test script has become out of step with the software under test, so the test case can be aborted rather than allowed to continue; letting test cases continue when the expected outcome has not been achieved can be wasteful. It is also more complex: test cases that use many dynamic comparisons take more effort to create, are more difficult to write correctly (more errors are likely, so more script debugging will be necessary), and incur a higher maintenance cost. Post-execution comparison is the comparison performed after a test case has run. It is mostly used to compare outputs other than those sent to the screen, such as files that have been created and the updated content of a database. If we simply look at whatever happens to be available after the test case has been executed, this is a passive approach. If we intentionally save particular results that we are interested in during a test case, for the express purpose of comparing them afterwards, this is an active approach to post-execution comparison.
  • #12: When a test case requires one or more post-execution comparisons, it is usually a different tool that performs them. In this situation the test execution tool may not run the post-execution comparator(s) of its own accord, so we will have to run the comparator(s) ourselves. Figure 4.1 shows this situation in terms of the manual and automated tasks necessary to complete a set of 'automated' test cases. Figure 4.1 does not look much like efficient automated testing, and indeed it is not. It would be nice if the test execution tool were responsible for running the comparator, but unless we tell it to do so, and tell it how to do so, it is not. To make the test execution tool perform the post-execution comparisons we will have to specifically add the necessary instructions to the end of the test script. This can amount to a significant amount of work, particularly if there are a good number of separate comparisons to be performed.
  • #13: Even when we have added the instructions to perform the post-execution comparison we may not have solved the whole problem. Figure 4.2 shows why. The test execution tool will probably be able to tell us that the test case ran successfully (or not) but it may not tell us anything about the results of the post-execution comparisons. Assessing the results of the post-execution comparison is then a manual task. We have to look in two places to determine the final status of the test case run: the execution tool's log or summary report and the output from the comparator tool(s). In an ideal world the interface between the test execution tool and the post-execution comparators would be seamless, but there is usually a gap that we have to fill ourselves.
  • #14: Testware is the term we use to describe all of the artifacts required for testing, including documentation, scripts, data, and expected outcomes, and all the artifacts generated by testing, including actual outcomes, difference reports, and summary reports. Architecture is the arrangement of all of these artifacts; that is, where they are stored and used, how they are grouped and referenced, and how they are changed and maintained. Testware comprises the test materials (inputs, scripts, data, documentation, expected outcomes) and the test results: the products (actual outcomes) and the by-products (logs, statistics, reports).
  • #15: We divide the test materials into logical sets that we call Test Sets. Each Test Set contains one or more test cases. Normally Test Sets contain a few tens of test cases, but they may contain a few hundred or, at the other extreme, a single test case. A Test Suite is simply a collection of Test Sets and therefore contains all the test materials required to run the test cases contained within the Test Sets. There are two alternative ways of managing the configuration of the testware. The method that we favor is for the Testware Sets to be stored in the Testware Library as configuration items (that is, having a version number). The individual testware artifacts that make up the content of each type of set do not have their own version numbers. The effect of this is that whenever anything in the Testware Set is changed, a new version of the Testware Set is created containing the changed artifacts and the unchanged artifacts.