
DIPANJAN PAN

Mail Id: [email protected]

Mob: +91 8392048439

Address: Dangapara, P.O- Kalna, Dist.-Burdwan, Pin-713409

INTRODUCTION
A qualified software professional with 9 years of experience in ETL development practices and
implementation of DWH applications using Azure cloud, Snaplogic, DataStage, Informatica and SQL.
Involved in the requirements, analysis, design, planning and implementation phases of software
applications, working end to end from raw data to the reporting layer for BI applications.
Worked in the Agile SAFe methodology and DevOps mode, with experience delivering to the client
in every sprint and PI as per requirement.

SKILLS SUMMARY
Operating System : Windows, Unix

Languages : SQL, Shell

Tools : Informatica, DataStage, Snaplogic, Databricks, ADF, Azure

Databases/Storage : Oracle, MySQL, DynamoDB, S3, SQL Server

Certifications : Snaplogic, Azure Data Fundamentals (DP-900)

PROFESSIONAL EXPERIENCE (OFFSHORE)
 9 years of work experience in DWH concepts using Informatica, DataStage, Oracle, SQL,
Unix, Snaplogic and Databricks.
 Developed end-to-end data migration processes, pulling data from different sources,
processing it through ADF, and using Azure Synapse and Databricks to organize the
data in the database for reporting purposes.
 Expert in designing jobs using Informatica, DataStage and Snaplogic.
 Expert in Oracle SQL Developer, with query performance tuning, partitioning, indexing
and data analysis for different issues (a brief sketch follows this list).
 Proficient in SQL queries, RDBMS concepts and data warehousing concepts.
 Good team player with strong interpersonal skills and the ability to work in a team.
 Ability to learn new technologies as requirements and times change.
 Attended daily stand-ups and delivered in each sprint and PI in Agile.
 Expert in designing sequence jobs and in performance analysis of ETL jobs.
 Experienced in database programming for data warehouses (schemas), proficient
in dimensional modeling (star schema and snowflake modeling).
 Good knowledge of UNIX shell scripting using K-shell for process automation and
for scheduling DataStage jobs using wrappers.
 Experience in analyzing the data generated by the business process, defining the
granularity, mapping data elements from source to target, and creating indexes and
aggregate tables for data warehouse design and development.
 Expert in both waterfall and DevOps models for project delivery.
 Industry experience spanning Telecom, Transportation and Retail, with good
knowledge of the various SDLC phases.
 Excellent communication and coordination skills, both within the team and
with clients.
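
The partitioning and indexing work mentioned above typically follows the pattern below. This is a minimal Oracle SQL sketch, not project code; the sales_fact table, its columns and the index name are hypothetical, for illustration only.

    -- Hypothetical example: range-partition a large fact table by month so
    -- queries filtering on sale_date scan only the relevant partitions.
    CREATE TABLE sales_fact (
        sale_id    NUMBER,
        store_id   NUMBER,
        sale_date  DATE,
        amount     NUMBER(12,2)
    )
    PARTITION BY RANGE (sale_date)
    INTERVAL (NUMTOYMINTERVAL(1, 'MONTH'))
    (PARTITION p_initial VALUES LESS THAN (DATE '2017-01-01'));

    -- Local index on the join/filter key, partitioned along with the table.
    CREATE INDEX idx_sales_fact_store ON sales_fact (store_id) LOCAL;

    -- Refresh optimizer statistics so the tuned access paths are actually used.
    EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'SALES_FACT');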

PROFESSIONAL EXPERIENCE (ONSHORE)
 Moved to Sweden on client requirement, interacting directly with the client as a Data
Engineer to gather requirements and deliver against them.
 Dealt with the client onshore and attended meetings to discuss business benefits.
 Helped the client improve their business model to achieve greater profitability.

EMPLOYMENT SUMMARY
Organization              Designation                    Duration
IBM                       Application Developer          04/2014 - 09/2019
Accenture Soln Pvt Ltd    Senior Application Developer   10/2019 - 08/2021
PwC SDC                   Senior Analyst                 08/2021 - 10/2023
Vrize India Pvt Ltd       Principal Software Engineer    12/2023 - Present

PROJECT SUMMARY
PROJECT #1

Client : Pepsico
Project Title : Pepsico DMS
Company : PwC, Bangalore
Duration : Jan 2023 to Oct 2023
Tools and technology used : Azure ADF, Synapse Analytics, Databricks, SQL Server

Project Description: -

Prepare Power BI reports by analyzing the source data and applying business rules to
produce a clean report.
Roles and Responsibilities
 Involved in source data analysis from various sources like Excel and Blob Storage.
 Captured the data from source tables in the database and processed files to Azure via
ADF and Databricks for further analysis.
 Developed ADF pipelines to process the source data from the Data Lake for
downstream processing.
 Used Databricks and Azure Synapse Analytics to implement various transformation
logic.
 Implemented SCD1 and SCD2 to capture the history and trend of the data (see the
sketch after this list).
 Prepared organized data and stored it in the database for Power BI reporting.
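
The SCD2 load noted above generally follows the pattern below. This is a minimal Databricks SQL sketch, not the exact project code; the dim_customer and stg_customer tables and their columns are hypothetical.

    -- Hypothetical SCD2 load: close out changed current rows, insert new versions.
    MERGE INTO dim_customer AS tgt
    USING (
        -- Pass 1: staged rows keyed normally; unchanged and brand-new keys resolve here.
        SELECT s.customer_id AS merge_key, s.* FROM stg_customer s
        UNION ALL
        -- Pass 2: NULL merge_key never matches the ON clause, forcing an INSERT of
        -- the new version for customers whose tracked attribute changed.
        SELECT NULL AS merge_key, s.*
        FROM stg_customer s
        JOIN dim_customer d
          ON s.customer_id = d.customer_id
         AND d.is_current = true
         AND s.address <> d.address
    ) AS src
    ON tgt.customer_id = src.merge_key AND tgt.is_current = true
    WHEN MATCHED AND tgt.address <> src.address THEN
      UPDATE SET is_current = false, end_date = current_date()
    WHEN NOT MATCHED THEN
      INSERT (customer_id, address, start_date, end_date, is_current)
      VALUES (src.customer_id, src.address, current_date(), NULL, true);

An SCD1 load is the same MERGE without the UNION trick: matched rows are simply overwritten in place, so no history is kept.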

PROJECT #2

Client : Trimble
Project Title : Trimble Data Migration
Duration : August 2021 to Dec 2022
Company : PwC, Bangalore.
Tools and technology used : Snaplogic, Azure Databricks, Salesforce

Project Description: -

Trimble is a US-based client that builds a range of products. Our goal was to transfer their
worldwide data from their several old systems to Salesforce.

Roles and Responsibilities


 Developed all the pipelines for the different interfaces: Salesforce to Salesforce,
Oracle to Salesforce, and flat file to Salesforce.
 Used Databricks queries to implement different business logic.
 Used parquet files to exchange data between Azure and Snaplogic and to create tables
in Databricks (see the sketch after this list).
 Used the REST Snap to communicate with end systems via JSON.
 Analyzed the data and loaded it into Marketing Cloud for client campaigns.
 Created reusable assets for pipeline migration from one end to another.
 Attended Scrum meetings, PI planning and feature creation, and met with the client
for various requirement gathering.
 Managed some juniors and was responsible for end-to-end delivery.
 Communicated with the client to solve UAT and SIT bugs efficiently.
 Was part of two go-lives for this project, migrating data from different source systems
to Salesforce.
 Worked on Azure Databricks notebooks to transform the data as per business
requirements.
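
The parquet hand-off mentioned above can be illustrated as follows. This is a minimal sketch assuming Snaplogic has written parquet files to an ADLS landing path; the schema, table name and storage path are hypothetical.

    -- Hypothetical example: register parquet files landed by Snaplogic in ADLS
    -- as a queryable Databricks table (schema inferred from the files).
    CREATE TABLE IF NOT EXISTS staging.trimble_accounts
    USING PARQUET
    LOCATION 'abfss://landing@<storage_account>.dfs.core.windows.net/trimble/accounts/';

    -- Downstream business logic can then run as ordinary SQL.
    SELECT country, COUNT(*) AS account_count
    FROM staging.trimble_accounts
    GROUP BY country;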

PROJECT #3

Client : Workday
Project Title : Workday Platform integration
Duration : June 2020 to Aug 2021
Company : Accenture.
Tools and technology used : Snaplogic, Salesforce, Reltio, Postman

Project Description: -
 Workday is an HR management product. Our goal was to replicate its data, on a real-time or
scheduled basis, to Salesforce and Reltio, and vice versa.

Roles and Responsibilities


 Involved in analysis for the stories worked on.
 Developed all the pipelines for the different interfaces from Salesforce to Workday.
 Used different configurations to process data in real time, such as ultra pipelines and
post-create/update calls into Workday via URL.
 Added business logic through the Mapper and used various Snaps such as Redshift Execute,
Mapper, Filter, Copy, Pipeline Execute, Salesforce Upsert, Salesforce Lookup and others.
 Used the REST Snap to communicate with end systems via JSON.
 Used the JSON Formatter and JSON Splitter Snaps to format the data.
 Used various map functions to format the data.
 Worked on Reltio for master data management, and on Postman to call trigger tasks
through the API.
 Analyzed the data and loaded it into Marketing Cloud for client campaigns.
 Created reusable assets for pipeline migration from one end to another.
 Attended Scrum meetings, PI planning and feature creation, and met with the client for
various requirement gathering.

PROJECT #4

Client : Wendys
Project Title : Wendys Data Management
Duration : October 2019 to June 2020
Company : Accenture.
Tools and technology used : Snaplogic, Redshift, Salesforce, Service Cloud
Project Description: -
 Wendy's is a food chain company. Our goal was to transfer their data from their on-premise
database system to Salesforce Service and Marketing Cloud using proper business logic.

Roles and Responsibilities


 Involved in the development of new stories.
 Developed all the pipelines for the different interfaces from AWS to Salesforce.
 Used different source systems such as DynamoDB and S3 to extract files with incremental
logic (see the sketch after this list).
 Added business logic through the Mapper and used various Snaps such as Redshift Execute,
Mapper, Filter, Copy, Pipeline Execute, Salesforce Upsert, Salesforce Lookup and others.
 Used the DynamoDB Snap and REST APIs to extract data from cloud-based engines.
 Worked on Reltio for master data management, and on Postman to call trigger tasks
through the API.
 Analyzed the data and loaded it into Marketing Cloud for client campaigns.
 Created reusable assets for pipeline migration from one end to another.
 Attended Scrum meetings, PI planning and feature creation, and met with the client for
various requirement gathering.
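
The incremental logic referenced above usually follows a watermark pattern like the one below. This is a minimal SQL sketch; the orders table, the etl_control table and their columns are hypothetical.

    -- Hypothetical watermark-based incremental extraction.
    -- 1. Read the last successfully loaded watermark.
    SELECT max_loaded_at
    FROM etl_control
    WHERE source_name = 'orders';

    -- 2. Extract only rows changed since that watermark
    --    (:max_loaded_at is bound from step 1 by the pipeline).
    SELECT *
    FROM orders
    WHERE updated_at > :max_loaded_at;

    -- 3. On successful load, advance the watermark for the next run.
    UPDATE etl_control
    SET max_loaded_at = (SELECT MAX(updated_at) FROM orders)
    WHERE source_name = 'orders';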

PROJECT #5

Client : IKEA
Project Title : IKEA Datawarehouse management system
Duration : Jan 2017 to Sep 2019
Company : IBM.
Tools and technology used : DataStage, Oracle, SQL Developer, PL/SQL, Unix, Cognos, QlikView

Project Description: -
 IKEA is a retail client. Our goal was to manage their daily sales report and various sales
and business reports, and to maintain their DWH system.

Roles and Responsibilities


 Requirement gathering.
 Developed code with proper business logic and created the application for different
business units.
 Joined daily stand-ups, created stories for features and delivered them in the
appropriate sprint.
 Analyzed the data and developed jobs for specific solutions to build a DataMart.
 Provided KT to juniors.
 Attended Scrum meetings, PI planning and feature creation.
 Communicated with users and gathered requirement details for any code change.
 Performance-tuned jobs at the lookup and join stages using partitioning techniques.
 Helped the AMS team solve any data issues raised by end users.
 Involved in direct interaction with the client, delivering features to improve business
profitability.
 Developed several pipelines to transfer data from source to target DB, with various
business logic added in the pipeline.
 Interacted directly with the client to discuss how to improve the business process by
adding new features to the existing application.

PROJECT #6

Client : Celcom
Project Title : Celcom sales rep
Duration : Nov 2014 to Dec 2016
Company : IBM.
Tools and technology used : DataStage, Oracle, SQL Developer, Netezza, PL/SQL, Unix, Cognos

Project Description: -
 Celcom is a telecom company. Our goal was to maintain their data warehouse and
process their daily sales and business data.

Roles and Responsibilities


 Developed the ETL jobs as per business requirements.
 Populated the data in the DataMart, loaded it into the data warehouse, and prepared
data for the reporting layer as per all business logic.
 Monitored the jobs and improved performance based on pre-prod runs and the
configuration file in production.
 Gathered information from senior architects and designed the job flow in an
effective way.
 Validated the loads with basic checks such as count, duplicate, SCD1 and SCD2 checks,
plus other test cases as per project requirements (see the sketch after this list).
 Completed SIT and UAT, and deployed the code to prod once the go-ahead was
received from business.
 Prepared the DR document, BAT document and other deployment-related documents.
 Scheduled all jobs and automated the process with a framework.
 Discussed and ensured client standards were incorporated in all designs and
developments.
 Provided shadow support for 2 months, solving incidents and job issues after go-live.
 Provided KT to the support team and prepared handover-related documents after
production go-live.
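
The count and duplicate checks mentioned above can be expressed as simple SQL, as sketched below; the sales_stg and sales_dw tables and the order_id key are hypothetical.

    -- Hypothetical row-count reconciliation between staging and the warehouse target.
    SELECT (SELECT COUNT(*) FROM sales_stg) AS stg_count,
           (SELECT COUNT(*) FROM sales_dw)  AS dw_count
    FROM dual;

    -- Duplicate check on the business key: any rows returned are defects.
    SELECT order_id, COUNT(*) AS dup_count
    FROM sales_dw
    GROUP BY order_id
    HAVING COUNT(*) > 1;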

PROJECT #7

Client : Oncor
Project Title : Smart Electric billing system
Duration : June 2014 to Dec 2016
Company : IBM.
Tools and technology used : Informatica, Oracle, SQL Developer, PL/SQL

Project Description: -
 Our goal was to maintain the smart electric billing system data for Oncor and provide
billing system solutions.
Roles and Responsibilities
 Developed jobs from different sources such as Oracle, flat files and CSV files.
 Created tables with a proper data model.
 Scheduled the jobs and monitored their performance.

EDUCATION
Degree             Board/University   Year of Passing   Percentage
B.Tech             WBUT               2013              73
Higher Secondary   WBCHSE             2009              76
Secondary          WBBSE              2007              92

DECLARATION

I hereby declare that the information furnished above is true to the best of my knowledge.

Thanks & Regards,

Dipanjan Pan

PH NO: +91-8392048439
