Possible Test Scenarios and Test Cases for an ETL Project
The following test scenarios, with their corresponding test cases, are commonly covered in an ETL project.
Mapping Doc Validation
- Validates the structure of the source and target tables against the corresponding mapping sheets.
- Validates whether the source and destination data types are the same.
- Verifies the length of the data types for the source and the destination.
- Verifies whether the formats and types of data fields are specified.
- Validates the name of each column against the mapping doc.
- Verifies whether the related information is given in the mapping doc.
- Checks that a change log is maintained in every mapping doc.
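The structural checks against a mapping sheet can be sketched in Python. The table layout, column names, and `(type, length)` tuples below are illustrative assumptions, not a real project's schema:

```python
# Hypothetical sketch: validate a target table's structure against a
# mapping sheet. Column names and types are illustrative assumptions.

mapping_sheet = {
    # column name -> (data type, length)
    "customer_id": ("INT", 10),
    "customer_name": ("VARCHAR", 100),
    "created_on": ("DATE", None),
}

target_table = {
    "customer_id": ("INT", 10),
    "customer_name": ("VARCHAR", 50),  # deliberate length mismatch
    "created_on": ("DATE", None),
}

def structure_mismatches(mapping, table):
    """Return human-readable differences between the mapping sheet
    and the actual table definition."""
    problems = []
    for col, (dtype, length) in mapping.items():
        if col not in table:
            problems.append(f"missing column: {col}")
            continue
        actual_dtype, actual_length = table[col]
        if actual_dtype != dtype:
            problems.append(f"{col}: type {actual_dtype} != {dtype}")
        if actual_length != length:
            problems.append(f"{col}: length {actual_length} != {length}")
    for col in table:
        if col not in mapping:
            problems.append(f"unexpected column: {col}")
    return problems

print(structure_mismatches(mapping_sheet, target_table))
# -> ['customer_name: length 50 != 100']
```

The same helper can be run per table, with one mapping sheet per target, as part of an automated regression suite.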
Data Consistency Issues
- Even if the semantic definition is the same, the data type and length may vary across tables or fields.
- Checks whether integrity constraints are used correctly.
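One common integrity-constraint check is verifying that every foreign-key value in a child table exists in its parent. A minimal sketch, assuming rows have been fetched into lists of dicts (table and column names are illustrative):

```python
# Hypothetical integrity-constraint check: find child rows whose
# foreign-key value has no matching parent row (an "orphan").

orders = [
    {"order_id": 1, "customer_id": 101},
    {"order_id": 2, "customer_id": 102},
    {"order_id": 3, "customer_id": 999},  # deliberate orphan
]
customers = [{"customer_id": 101}, {"customer_id": 102}]

def orphan_rows(child, parent, key):
    parent_keys = {row[key] for row in parent}
    return [row for row in child if row[key] not in parent_keys]

print(orphan_rows(orders, customers, "customer_id"))
# -> [{'order_id': 3, 'customer_id': 999}]
```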
Data Completeness Issues
- Makes sure that data is transferred from the source to the destination as expected.
- Compares the record counts between the source and the destination.
- Ensures that data is not truncated in the columns of the target tables.
- Checks for rejected records.
- Performs boundary value analysis.
- Checks the unique key attributes of the loaded data.
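The count and truncation checks above can be sketched as follows; the sample rows and column names are assumptions for illustration:

```python
# Hypothetical completeness checks on rows already fetched from the
# source and target (data is illustrative).

source_rows = [
    {"id": 1, "name": "Alice"},
    {"id": 2, "name": "Bobby"},
]
target_rows = [
    {"id": 1, "name": "Alice"},
    {"id": 2, "name": "Bob"},  # deliberately truncated
]

def count_matches(source, target):
    return len(source) == len(target)

def truncated_ids(source, target, column):
    """IDs whose target value is shorter than the source value,
    a common symptom of silent truncation."""
    by_id = {row["id"]: row[column] for row in source}
    return [row["id"] for row in target
            if len(row[column]) < len(by_id[row["id"]])]

print(count_matches(source_rows, target_rows))   # True
print(truncated_ids(source_rows, target_rows, "name"))  # [2]
```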
Constraint Issues
- Validates that the constraints are defined for the specific table as expected.
Data Correctness Issues
- Checks whether data is recorded and spelled correctly.
- Checks for null, non-unique, and out-of-range data.
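The null and range checks can be sketched as below (the `age` column and its 0–120 valid range are assumptions for illustration):

```python
# Hypothetical correctness checks: flag rows with null or
# out-of-range values in a column.

rows = [
    {"id": 1, "age": 34},
    {"id": 2, "age": None},  # deliberate null
    {"id": 3, "age": 250},   # deliberately out of range
]

def null_ids(rows, column):
    return [r["id"] for r in rows if r[column] is None]

def out_of_range_ids(rows, column, lo, hi):
    return [r["id"] for r in rows
            if r[column] is not None and not lo <= r[column] <= hi]

print(null_ids(rows, "age"))                 # [2]
print(out_of_range_ids(rows, "age", 0, 120)) # [3]
```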
Data Transformation Issues
- Checks whether the data has been converted to the right format during transformation.
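A typical format-conversion check recomputes the expected target value from the source value and compares. The sketch below assumes a DD/MM/YYYY source format converted to ISO YYYY-MM-DD; both formats are illustrative:

```python
# Hypothetical transformation check: was the source date string
# converted to the expected ISO format in the target?
from datetime import datetime

def converted_correctly(source_value, target_value):
    expected = datetime.strptime(source_value, "%d/%m/%Y").strftime("%Y-%m-%d")
    return target_value == expected

print(converted_correctly("31/01/2024", "2024-01-31"))  # True
print(converted_correctly("31/01/2024", "01/31/2024"))  # False
```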
Data Quality Issues
- Validates the data against parameters such as number check, precision check, date check, and null check.
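The parameter checks just listed can each be expressed as a small validator; the precision limit and date format below are assumptions:

```python
# Hypothetical data-quality validators: number, precision, date,
# and null checks.
from datetime import datetime

def is_number(value):
    try:
        float(value)
        return True
    except (TypeError, ValueError):
        return False

def has_precision(value, places):
    """True if the value has at most `places` decimal places."""
    text = str(value)
    return "." not in text or len(text.split(".")[1]) <= places

def is_date(value, fmt="%Y-%m-%d"):
    try:
        datetime.strptime(value, fmt)
        return True
    except (TypeError, ValueError):
        return False

def is_not_null(value):
    return value is not None and value != ""

print(is_number("12.5"))            # True
print(has_precision("12.345", 2))   # False
print(is_date("2024-02-30"))        # False (invalid calendar date)
print(is_not_null(""))              # False
```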
Null Validation
- Validates the data for null and non-null values.
Duplicate Checks
- Checks the data for duplicate values; the values of a column must be unique as per the business requirement once it is defined as a primary key or unique key.
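A duplicate check on a column the business defines as unique can be sketched with a frequency count (the email column is an illustrative assumption):

```python
# Hypothetical duplicate check on a column required to be unique.
from collections import Counter

loaded_emails = ["a@x.com", "b@x.com", "a@x.com", "c@x.com"]

def duplicates(values):
    return sorted(v for v, n in Counter(values).items() if n > 1)

print(duplicates(loaded_emails))  # ['a@x.com']
```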
Data Validation
- Performed to determine the row creation date.
- Verifies the list of active records from the ETL development perspective.
- Verifies active records based on business requirements.
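The active-record and creation-date checks can be sketched as below; the `active` flag and `created_on` column names are assumptions:

```python
# Hypothetical checks: list active records and look up a row's
# creation date (column names are illustrative).
from datetime import date

rows = [
    {"id": 1, "active": "Y", "created_on": date(2024, 1, 5)},
    {"id": 2, "active": "N", "created_on": date(2024, 1, 6)},
]

def active_records(rows):
    return [r for r in rows if r["active"] == "Y"]

def creation_date(rows, row_id):
    return next(r["created_on"] for r in rows if r["id"] == row_id)

print(len(active_records(rows)))   # 1
print(creation_date(rows, 2))      # 2024-01-06
```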
Data Cleaning Issues
- Makes sure that unwanted data is deleted before it is transferred to the destination database.
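A cleaning step is usually a filter applied before load; the rule below (dropping rows whose name starts with `TEST_`) is an illustrative assumption:

```python
# Hypothetical cleaning step: drop unwanted (here, test) rows
# before loading into the destination.

extracted = [
    {"id": 1, "name": "Alice"},
    {"id": 2, "name": "TEST_USER"},
]

def clean(rows):
    return [r for r in rows if not r["name"].startswith("TEST_")]

print(clean(extracted))  # [{'id': 1, 'name': 'Alice'}]
```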
Complete Data Validation
- Validates the complete data in the source and the destination.
- Matches rows between the source and the destination.
- Ensures that the count returned by the intersection matches the individual counts of the source and the destination.
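The intersection check above can be mimicked with set operations on row tuples, similar to a SQL MINUS/INTERSECT comparison (the sample rows are illustrative):

```python
# Hypothetical full-row comparison between source and target using
# set intersection and difference.

source = {(1, "Alice"), (2, "Bob"), (3, "Carol")}
target = {(1, "Alice"), (2, "Bob")}

common = source & target
missing_in_target = source - target

print(len(common) == len(source))  # False: a source row is missing
print(missing_in_target)           # {(3, 'Carol')}
```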