
DHARESH KUMAR DAGA

Senior Software Engineer

Mobile: +91-79750-31187

E-mail: [email protected]

Professional Summary:

• Overall 6.10 years of experience in IT, primarily in software testing.
• Experience in Data Warehousing (ETL) testing and report testing, including Azure Data Factory and Azure Databricks testing.
• Learned the Tableau and Power BI reporting tools over 6 months.
• Worked on the Snowflake database for data retrieval for 3 months.
• Worked on the Teradata database for data retrieval for a month.
• Worked on the Azure Data Factory tool for 3 years.
• Worked on the Azure Databricks tool for 1 year.
• Domain experience: 3 years in Retail, 1+ years in Insurance, 1 year in Finance, 6 months in Loans, and 1 year in Life Sciences.
• Proficient in designing and executing test cases for data warehouse projects.
• Experience in end-to-end project implementation, data models, and the ETL process.
• Hands-on experience writing SQL queries to validate data.
• Good experience with data warehousing concepts.
• Good experience with Star and Snowflake schemas, and Fact and Dimension tables.
• Experience in web application functional and system testing.
• Experience with databases such as SQL Server, Oracle, Teradata, and Snowflake.
• Experience in Informatica Power Centre.
• Tested ETL processes using Informatica to load data from various sources to targets.
• Experience with basic Unix commands.
• Experience with the Quality Centre test management tool.
• Experience with Jira and a GUID generator tool.
• Experience in job scheduling.

Technical Skills:

Testing : ETL/DWH Testing & Manual Testing.

Operating Systems : Windows and UNIX.

Database : Oracle (10g, 11g), SQL Server (SSMS), Snowflake, Teradata, Azure SQL.

Test Management Tool : HP Quality Centre, HP-ALM, JIRA, X-Ray


ETL Tool : Informatica (Workflow Monitor), SSIS, FileZilla RTL, Azure Data
Factory, Azure Databricks, BMC Control-M

API Tool : Swagger

Job Scheduling Tool : BMC Control M

Model : Agile Model

BI Tool : Tableau, Power BI

Certifications : Azure Data Services and SQL (completed on Turing.com)

Professional Experience:
• Worked as a Senior Software Engineer (full time) at CGI ISMC, Bangalore, from March 2023
to Oct 2023.
• Worked as a Senior Software Engineer (full time) at APEXON, Bangalore, from Feb 2022
to Feb 2023.
• Worked as a Senior ETL Tester at TALENT-PLOYER Private Limited, Bangalore, from Sep 2021
to Dec 2021.
• Worked as a Data QA at Agile tech Info Solutions Private Limited, Bangalore, from Feb 2021 to
May 2021.
• Worked as a Software Engineer (full time) at Ubq Technologies Private Limited, Bangalore,
from Jan 2015 to July 2019.

EDUCATION:
• Bachelor of Computer Applications (BCA) from KUD University.

Projects Details:

Project #1: Stone Eagle

Description: Stone Eagle is a vehicle insurance company that provides contract creation, claims
handling for various product types, insurance coverage group verification, and rating facilities
through applications such as Secure Admin Dealer and Dealer Portal. Work covered various modules
of the application, such as Agent, Insurers, Admin Company, and Products.
Roles and Responsibilities:
• Analysing business requirements through the mapping document.
• Validating source and target record counts.
• Validating current and historical data and audit fields as part of SCD Type 2, Type 3,
and Type 4 validations.
• Validating transformation logic.
• Wrote scripts in PySpark and Python using Visual Studio Code and Azure Databricks.
• Used Azure Databricks for data ingestion and notebook testing; tested pipelines and
data ingestion in Azure Data Factory; merged DataFrames; and extracted data from
Azure Blob Storage, Azure Data Lake, Azure SQL, and on-premises SQL Server.
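The source-versus-target record count validation listed above can be sketched as follows; this is a minimal illustration using an in-memory SQLite database, with hypothetical table and column names standing in for the real source and target systems:

```python
import sqlite3

# In-memory database standing in for the source and target systems
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical source and target tables
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0)])

# Record count validation: the source count must equal the target count
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]

# MINUS-style check: keys present in the source but missing in the target
missing = cur.execute(
    "SELECT order_id FROM src_orders EXCEPT SELECT order_id FROM tgt_orders"
).fetchall()

print("counts match:", src_count == tgt_count, "| missing rows:", len(missing))
```

In practice the same two queries would run against the actual source and target databases rather than one in-memory database.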

Project #2: Ultra cam


Description:
Ultra cam brings together life sciences domain expertise, third-party system knowledge, and deep
technology to develop well-defined solutions that address challenges across Medical Affairs,
Regulatory, and Safety functions.

Roles and Responsibilities:


• Analysing business requirements through the mapping document.
• Validating the count and data mapping between the application and the database.
• Validating source and target record counts.
• Validating current and historical data and audit fields as part of SCD Type 2, Type 3,
and Type 4 validations.
• Validating transformation logic.
• Logged defects found during testing in Jira.
• Used Azure Databricks for data ingestion and notebook testing; tested pipelines and
data ingestion in Azure Data Factory; merged DataFrames; and extracted data from
Azure Blob Storage, Azure Data Lake, Azure SQL, and on-premises SQL Server.
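The SCD Type 2 validations mentioned above can be illustrated with a minimal sketch (table and column names are hypothetical): each business key should have exactly one current row, and every historical row should carry a closed effective-date range.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical SCD Type 2 dimension: one current row per business key,
# historical rows closed out with a real end date and is_current = 0
cur.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT,
    eff_start TEXT, eff_end TEXT, is_current INTEGER)""")
cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?, ?, ?)", [
    (1, "Pune",      "2020-01-01", "2021-06-30", 0),
    (1, "Bangalore", "2021-07-01", "9999-12-31", 1),
    (2, "Chennai",   "2020-01-01", "9999-12-31", 1),
])

# Check 1: exactly one current row per business key
bad_keys = cur.execute("""
    SELECT customer_id FROM dim_customer
    WHERE is_current = 1
    GROUP BY customer_id HAVING COUNT(*) <> 1""").fetchall()

# Check 2: historical rows must not keep the open-ended end date
open_history = cur.execute("""
    SELECT customer_id FROM dim_customer
    WHERE is_current = 0 AND eff_end = '9999-12-31'""").fetchall()

print("duplicate current keys:", len(bad_keys),
      "| open historical rows:", len(open_history))
```

Both queries should return no rows on a correctly loaded Type 2 dimension; any hit is a defect candidate.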

PROJECT #3: CVS IDW


Description:
CVS sells prescription drugs and a wide assortment of general merchandise, including over-the-counter
drugs, beauty products and cosmetics, film and photo finishing services, seasonal merchandise, greeting
cards, and convenience foods through their CVS Pharmacy and Longs Drugs retail stores and online
through CVS.com.

Roles and Responsibilities:


• Analysing the business requirements through the mapping document.
• Searching for files in the archive path and downloading them to the local system.
• DDL comparison between the Raw and Core layers, and against the mapping documents
and source file layout.
• Performing initial load, incremental load, duplicate data load, and already-existing-data
scenarios against the Core table.
• Validating current and historical data and audit fields as part of SCD Type 2 and
Type 4 validations.
• Preparing test data to perform validations when data is not available in the
source tables.
• Sending observations/expected results to the team.
• Analysing job failures and altering the file as per the errors found in the log so
that the file loads and gets archived in the target paths.
• Logged defects found during testing in Jira.
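The DDL comparison between the Raw and Core layers amounts to comparing each layer's column names and types from the database catalog. A minimal sketch, using SQLite's catalog and hypothetical layer tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical Raw-layer and Core-layer versions of the same table
cur.execute("CREATE TABLE raw_sales (sale_id INTEGER, store TEXT, amount REAL)")
cur.execute("CREATE TABLE core_sales (sale_id INTEGER, store TEXT, amount REAL)")

def ddl_of(table):
    # (column name, declared type) pairs read from the catalog
    return [(row[1], row[2]) for row in
            cur.execute(f"PRAGMA table_info({table})")]

raw_ddl, core_ddl = ddl_of("raw_sales"), ddl_of("core_sales")
diff = set(raw_ddl) ^ set(core_ddl)   # columns present in only one layer
print("DDL match:", not diff)
```

Against SQL Server or Azure SQL the catalog query would instead read `INFORMATION_SCHEMA.COLUMNS`, but the comparison logic is the same.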

PROJECT #4: CIRRUS


Description:
The client is one of the largest home loan servicers in the country, focused on delivering a
variety of servicing and lending products, services, and technologies.

Roles and Responsibilities:


• Analysing the business requirements in PBIs.
• Understanding the phases of data movement from source to target and preparing
test scenarios.
• Writing SQL statements for initial and incremental load testing as part of SIT,
per the acceptance criteria.
• Attending business meetings and Scrum ceremonies.
• Validating current and historical data and audit fields as part of SCD Type 2
validation.
• Preparing test data to perform validations when data is not available in the
source tables.
• Raising defects in Visual Studio when actual and expected results mismatch.
• Executing the Swagger process to load data from source to target.
• Logged defects found during testing in Jira.

Project #5: Costco


Project Title : Costco
Clients : Costco Retailer, USA
Duration : Jan 2017 to July 2019
ETL Tool : Azure Data Factory, Azure Databricks
Database : Azure DB, Oracle
Data Querying Tool : Toad
Ticketing Tool : Quality Centre.

Description:
Costco is a retail food store organization that sells a wide range of products through numerous
stores, whose databases hold a large amount of historical data. The goal of this project was to
create a sales data warehouse and generate reports using Business Objects.
The sales data warehouse analyses sales of major brands under different promotional schemes. The
system's database stores information about sales, profits, and customer contacts. Using this
database, the system analyses sales and profit and generates month-wise and year-wise reports,
giving an idea of profit growth and a prediction of increases in sales.
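The month-wise and year-wise sales reporting described above reduces to a GROUP BY aggregation over the sales fact data. A minimal sketch with hypothetical data (a year-wise report would group on `strftime('%Y', ...)` instead):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical sales fact table
cur.execute("CREATE TABLE sales (sale_date TEXT, amount REAL)")
cur.executemany("INSERT INTO sales VALUES (?, ?)", [
    ("2018-01-10", 100.0), ("2018-01-25", 50.0), ("2018-02-05", 75.0)])

# Month-wise sales totals
rows = cur.execute("""
    SELECT strftime('%Y-%m', sale_date) AS month, SUM(amount)
    FROM sales
    GROUP BY month
    ORDER BY month""").fetchall()
print(rows)
```

Validating such a report typically means re-running the aggregation directly against the source data and comparing the totals with the generated report.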

Roles and Responsibilities:


• Understanding requirement specifications and design documents.
• Involved in writing and executing test cases.
• Running the jobs for the ETL process.
• Prepared and ran SQL scripts.
• Database testing and validating the data between the source and target databases.
• Prepared test data for testing.
• Logged defects found during testing in QC.
• Reporting the daily testing status.
• Used Azure Databricks for data ingestion and notebook testing; tested pipelines and
data ingestion in Azure Data Factory; merged DataFrames; and extracted data from
Azure Blob Storage, Azure Data Lake, Azure SQL, and on-premises SQL Server.

PROJECT #6: Aflac Life Insurance


Project Title : Aflac Life Insurance
Clients : Aflac
Duration : Jan 2016 to Dec 2016
ETL Tool : Informatica Power Centre
Database : Oracle, SSMS
Data Querying Tool : Toad
Ticketing Tool : Quality Centre.

Description:
Aflac Life Insurance is a complete, web-based life insurance project. It is mainly
focused on life insurance for the customer. The product reduces financial
vulnerability to natural disasters through life insurance coverage. It covers the
entire functionality for a customer, from the policy issue date to the policy renewal
date, and broadly consists of modules such as Agent Management, New Business
Management, Policy Management, Product Management, Accounts, Reinsurance, and
Underwriting.

Roles and Responsibilities:


• Understanding requirement specifications and design documents.
• Involved in writing and executing test cases.
• Experience in the complete implementation of the project and understanding the end-to-end
flow of ETL jobs and logic.
• Running the jobs for the ETL process.
• Prepared and ran SQL scripts.
• Verifying the data in the target database.
• Prepared test data for testing.
• Logged defects found during testing in QC.
• Reporting the daily testing status.

PROJECT #7: Client Task Management System


Project Title : Client Task Management System
Clients : Qual FINZ, Malaysia
Duration : Jan 2015 to Dec 2015
ETL Tool : Informatica Power Centre
Database : Oracle
Data Querying Tool : Toad
Ticketing Tool : Quality Centre.

Description:
Qual FINZ is a company handling accounting operations for several clients.
This project stores information about clients' payments, such as monthly, quarterly,
annual, and income tax payments, and also maintains data such as ESI/PF and labour
payments.

Roles and Responsibilities:


• Analysis of the requirement specification document.
• Prepared test scenarios and test cases.
• Experience in project implementation, data models, and the ETL process.
• Hands-on experience writing complex queries for data validation as per business
requirements.
• Involved in system testing and regression testing.
• Executing test cases.
• Tracking defects using Quality Centre.
