Blue Prism Development Test Plans Template v1.0
Configuration Testing
Context
The testing here will be a mixture of formal and informal testing: some of it will consist of routine
automated system tests, while other tests will be more process-specific. The goal is to ensure that the process is in a
fit state to begin Verification with a Subject Matter Expert (SME).
Environment
All testing will take place in the Blue Prism Configuration Environment.
Any target system processing will initially be performed in a test system where applicable.
A test Customer Gateway system is currently available.
It is essential that the test target system is a precise mirror of the production system.
Testing Scope
Configuration testing focuses mainly on confirming interface functionality within the process – ensuring
that objects and components have been correctly created or modified, and are correctly used from within the
process, as per the design.
It may also include limited instances of Controlled Failure Tests – verifying the behaviour of the solution in
certain situations such as environment failure, e.g. what happens if one of the target applications is unavailable?
Test Requirements
For configuration testing, the configuration environment and the cases used during development are sufficient. There
are no additional environmental requirements.
Progression Criteria
There are usually no strict progression criteria; instead, the Blue Prism Configuration Analyst must be
satisfied that they are ready to proceed into the Verification Phase with the SME, and that the time spent
together will be productive.
A project manager will often wish to discuss the nature of the testing conducted so far with the Blue Prism
Configuration Analyst, who should be prepared for this.
Scenario Testing
Context
The verification phase forms part of configuration, and therefore the automated solution will still be
incomplete. The purpose of verification is twofold: firstly, it allows new screens to be configured that can
only be reached using live data (e.g. the "Account Closed" screen can only be seen after closing an account);
secondly, it allows a large variety of business scenarios to be tested so that confidence can be established in
the solution ahead of Acceptance Testing.
Since scenarios are being verified (and in some cases configured) for the very first time, the solution is not
considered "ready" – the verification phase is the means by which the solution will arrive at this state.
Environment
Verification takes place in the configuration environment, because an agile and rapid improvement cycle will
take place – the verification phase is indeed part of the configuration phase.
Testing Scope
The verification phase focuses mainly on Business Scenario Verification – configuring the process to
correctly deal with each business scenario, as per the design.
It may also include limited instances of Controlled Failure Tests – verifying the behaviour of the solution in
certain situations such as environment failure, e.g. what happens if an input source does not match the expected
file format?
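For illustration only: the sketch below shows the kind of input-format check a Controlled Failure Test might exercise. It is plain Python with hypothetical column names (CaseRef, CustomerName, AccountNumber) rather than part of the Blue Prism solution itself; in practice the equivalent validation would sit in the process or a utility object, as per the design.

    import csv

    # Hypothetical expected layout of the input file; the real column names
    # would come from the Solution Design Document for the process.
    EXPECTED_COLUMNS = ["CaseRef", "CustomerName", "AccountNumber"]

    def validate_input_file(path):
        """Return a list of format problems found in the input file."""
        problems = []
        with open(path, newline="") as handle:
            reader = csv.reader(handle)
            header = next(reader, None)
            if header is None:
                problems.append("File is empty")
            elif header != EXPECTED_COLUMNS:
                problems.append(f"Unexpected columns: {header}")
        return problems

    # A Controlled Failure Test would feed in a deliberately malformed file and
    # confirm that the process flags it for referral rather than working it.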
Test Requirements
· Sources of both real-life and fabricated data representing the defined scenarios
· Subject Matter Experts (SMEs) to help conduct testing and verify outputs
· Sufficient volumes of real-world data to satisfy the Live-Data Testing requirements.
Progression Criteria
An informal set of criteria will be required to define the end of the Verification Phase. Typically, this will be
a minimum set of scenarios and volumes to be worked.
Reference Training
The following reference material will help complete training for this delivery documentation.
Title: Lifecycle Orientation
Description: This is a Blue Prism portal page, providing a brief explanation of the Blue Prism Lifecycle Orientation and related documents.
Blue Prism portal path: Home > Learning > Lifecycle Orientation

Title: Delivery Roadmap
Description: This document describes the end-to-end steps in creating and delivering a Blue Prism process solution. The key process phases are outlined from Initial Process Assessment through to Testing.
Blue Prism portal path: Home > Documents

Title: Lifecycle Orientation Sample Delivery Documents
Description: All prescribed delivery documents, fully completed. These are referenced within the Delivery Roadmap and provide an example of the content and level of detail required.
Blue Prism portal path: Home > Documents

Title: Process Delivery Methodology
Description: The Blue Prism Process Delivery Methodology is a proven means of delivering ongoing business benefit through process automation using a controlled and structured Automation Framework.
Blue Prism portal path: Home > Documents

Title: Test Phases Overview
Description: This Test Phases document describes the standard test phases of a Blue Prism project, ensuring that automated solutions are delivered into live with the optimum level of testing throughout development, so that the delivered processes meet business requirements and contain the minimum possible level of system exceptions.
Blue Prism portal path: Home > Documents

Title: Testing Approach
Description: This document provides guidelines on the testing approaches that should be considered when testing RPA solutions.
Blue Prism portal path: Home > Documents

Title: Blue Prism - Introducing Your Process to Live Data
Description: This guide outlines the methods available for introducing your process to live data. It should be considered prior to defining your delivery methodology and test approach.
Blue Prism portal path: Home > Documents
TEST PLAN TIMELINE
Process Name:
Project Manager:
Developer:
SME:
Testing Cycle | Scenario Coverage
Verification Cycle 1 | Work 1 case and sample 100% to ensure all actions are correct.
Verification Cycle 2 | Work 4 cases and sample 100% to ensure all actions are correct.
Verification Cycle 3 | Work 10 cases and sample 100% to ensure all actions are correct.
UAT Cycle 4 | Work 20 cases and sample 100% to ensure all actions are correct.
UAT Cycle 5 | Work 50 cases and sample 70% to ensure all actions are correct.
UAT Cycle 6 | Work 100 cases and sample 30% to ensure all actions are correct.
Notes: UAT test scenarios will aim for a confidence level of 80%; however, this may not be possible under certain live data conditions. If 80% cannot be achieved during live-data UAT testing, the Business will have to decide whether to accept the risks or continue testing.
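As a worked illustration of the sampling targets above (illustrative Python only, using the figures from the table), the sketch below converts each cycle's "work N cases and sample X%" target into the number of cases that would be checked manually; for example, UAT Cycle 5 implies checking 35 of the 50 worked cases and UAT Cycle 6 implies checking 30 of the 100.

    import math

    # Cycle definitions taken from the coverage table above; the helper itself
    # is purely illustrative.
    cycles = [
        ("Verification Cycle 1", 1, 100),
        ("Verification Cycle 2", 4, 100),
        ("Verification Cycle 3", 10, 100),
        ("UAT Cycle 4", 20, 100),
        ("UAT Cycle 5", 50, 70),
        ("UAT Cycle 6", 100, 30),
    ]

    for name, cases_worked, sample_pct in cycles:
        # Round up so a fractional case still gets checked end to end.
        sampled = math.ceil(cases_worked * sample_pct / 100)
        print(f"{name}: work {cases_worked} cases, manually check {sampled} ({sample_pct}%)")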
Timeline milestones (enter planned dates in dd/mm/yyyy format):
Verification Cycle 1: dd/mm/yyyy
Verification Cycle 2: dd/mm/yyyy
Verification Cycle 3: dd/mm/yyyy
UAT Cycle 4: dd/mm/yyyy
UAT Cycle 5: dd/mm/yyyy
UAT Cycle 6: dd/mm/yyyy
Go / No Go Decision: dd/mm/yyyy
Go Live Day: dd/mm/yyyy
Warranty: dd/mm/yyyy
Status key: Test, In Progress, UAT Complete, Completed
Timeline annotations: BANK HOLIDAY UK, CURRENT POSITION
CONFIGURATION TEST PLAN
Process Name:
Number of Configuration Tests: 2
Passed Configuration Tests: 0%

ID # | Name | Notes | Pass?
| The process can start and log into <System A> and <System B> using Credentials. | If the <System B> password expires, the process will generate a new password and update the credentials. | No
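To illustrate the password-expiry handling described in the Notes column, here is a minimal sketch of the logic only, using hypothetical names and an in-memory credential record; in the actual solution the password change would be made in <System B> itself and the stored credential updated through Blue Prism's Credential Manager rather than in code.

    import secrets
    import string
    from datetime import date, timedelta

    # Hypothetical in-memory stand-in for the <System B> credential; in the real
    # solution the credential would be held in the Blue Prism Credential Manager.
    credential = {
        "system": "System B",
        "password": "OldPassw0rd!",
        "expires": date.today() - timedelta(days=1),  # already expired, to exercise the check
    }

    def generate_password(length=12):
        """Build a random password from letters, digits and a few symbols."""
        alphabet = string.ascii_letters + string.digits + "!$%&"
        return "".join(secrets.choice(alphabet) for _ in range(length))

    def refresh_if_expired(cred):
        """If the stored password has expired, replace it and return True."""
        if cred["expires"] <= date.today():
            cred["password"] = generate_password()
            cred["expires"] = date.today() + timedelta(days=90)
            return True
        return False

    print(refresh_if_expired(credential))  # True: a new password was generated and stored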
SCENARIO TEST PLANNING
Process Name:
ID # | Name | Description | Status
| | To include one male, one female and one child. | In Scope - Not Covered
FAULTS IDENTIFIED
Process Name:
ID | Date | Process | Fault Description | Status
(Rows D001 to D050 are pre-numbered for recording individual faults.)

Status values: Open, Open - Investigating, Closed - fixed, Closed - opened in error, Awaiting Retest, On Hold
TOTAL: