
PAVAN KUMAR

[email protected]
510-697-5311
Lead Informatica ETL, IDQ/MDM Developer
Professional Summary:

 Around 10 years of Software Life Cycle experience in System Analysis, Design, Development, Implementation, Maintenance, and Production Support of Data Warehouse applications.
 Extensive experience with Informatica PowerCenter 10.x/9.x/8.x/7.x, IDQ 10.2.1/9.0.1/8.6, MDM 10.3, B2B, PCF, and Node.js, as well as Teradata, SQL Server, and Oracle databases
 Worked across various domains, including Healthcare, Insurance, and Finance
 Expertise in working with Informatica Designer, Informatica Developer, Informatica MDM, Workflow Manager, Workflow Monitor, Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer, Worklet Designer, Gantt Chart, Task View, Mappings, Workflows, Sessions, Reusable Transformations, Shortcuts, and Import and Export utilities.
 Experience in integrating various data sources with multiple relational databases like Oracle, Teradata, and SQL Server, and in integrating data from flat files such as fixed-width, delimited, CSV, HL7 (Health Level 7), and EPIC .EXT files
 Good knowledge of data warehouse concepts and principles (Kimball/Inmon): Star Schema, Snowflake, Data Vault, Oracle Designer, modeling tools like Erwin and Visio, SCDs, surrogate keys, and normalization/de-normalization.

 Experience in analysis, estimation, design, construction, unit and system testing, and implementation.
 Extensive knowledge of designing functional and detailed design documents for data warehouse development
 Experience in ActiveVOS workflow design and creation of Human Tasks; extensive experience in MDM 10.x with IDQ and ActiveVOS 9.2.
 Experience in managing ActiveVOS Central and setting up the Identity Service on the ActiveVOS Console
 Good knowledge of ActiveVOS error and fault handling, event handling, gateways, and control flow
 Understanding of monitoring services (Process, Task, Server) and scheduling ActiveVOS automatic processes
 Worked on IDQ tools for data profiling, data enrichment and standardization.
 Experience in developing IDQ mappings to load cleansed data into target tables using various IDQ transformations, and in data profiling and analyzing scorecards to design the data model
 Experience in creating Column Profiling, Rule Profiling, Midstream Profiling, Scorecards, Decision Tasks, Reference Tables, DQ Tables, and Notification Tasks as part of workflows, and in deploying them.
 Imported the IDQ address-standardization mappings into Informatica Designer as Mapplets
 Expertise in designing and developing Informatica Data Quality transformations/rules like
Address validator, Parser, Labeler, Match, Exception, Association, Standardizer and other
significant transformations
 Experience in Data Modeling, Star Schema, Snowflake Schema, Slowly Changing Dimensions,
Fact tables, Dimension tables, Normal forms, OLAP and OLTP systems.
 Good exposure to Informatica DVO for testing data validation by writing SQL queries against source and target databases, scheduling tasks, and reporting test results.
 Experience in the integration of various data sources such as Oracle, SQL Server, Sybase,
Teradata, DB2, XML Files and Flat files.
 Delivered two B2B projects using DT/DX (Data Transformation Studio), Data Transformation Accelerator, Data Format Transformation Library, and Data Transformation Engine, with full PowerCenter integration
 Designed MDM architecture solutions, complex match rules, and complex custom survivorship rules that cannot be handled by the out-of-the-box MDM tool, to determine the best version of truth (BVT), aka the Golden Record, along with a reusable framework
 Experience in making API and web service calls to third-party vendors by integrating with Node.js and PCF.
 Designed and developed an E2E real-time process that lets users insert records directly into the MDM schema.
 Configured Base Objects, Match Rules, Merge Rules, Trust Settings, Stage and Landing tables, Queries, Packages, Validation Rules, Mappings, and Hierarchies using the Informatica MDM Multi-Domain Hub. Built an Informatica Data Director (IDD) UI application for data stewards to strategize stewardship functionality and the Entity 360 view by providing governance on role-based operations
 Vast experience in designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, and Data Transformation Services
 Extracted data from multiple operational sources for loading the staging area, Data Warehouse, and data marts using CDC/SCD (Type 1/Type 2) loads.
 Experience in PL/SQL programming (stored procedures, triggers, packages) using Oracle, and good knowledge of Sybase.
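The CDC/SCD (Type 2) loading pattern mentioned above can be sketched as follows. This is a minimal illustrative example in Python rather than an Informatica mapping; the table layout, column names, and load date are all hypothetical, not taken from any specific project.

```python
from datetime import date

def apply_scd2(dimension, incoming, load_date=None):
    """Apply a Type 2 slowly changing dimension update in memory.

    dimension: list of dicts with keys key, name, eff_date, end_date, current
    incoming:  dict with keys key, name (the latest source record)
    """
    load_date = load_date or date(2024, 1, 1)  # hypothetical load date
    for row in dimension:
        if row["key"] == incoming["key"] and row["current"]:
            if row["name"] == incoming["name"]:
                return dimension  # no attribute change: nothing to do
            # expire the current version of the record
            row["end_date"] = load_date
            row["current"] = False
            break
    # insert the new version as the current row
    dimension.append({
        "key": incoming["key"],
        "name": incoming["name"],
        "eff_date": load_date,
        "end_date": None,
        "current": True,
    })
    return dimension
```

In an Informatica mapping the same logic is typically expressed with a Lookup on the dimension, an Expression comparing attributes, and an Update Strategy routing expired versus new rows.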

Technical Summary:

Operating Systems: UNIX, Linux, Windows
Programming and Scripting: C, C++, Java, .Net, Perl Scripting, Shell Scripting, XSLT, PL/SQL, T-SQL
Specialist Applications & Software: Informatica MDM (10.3, 10.2, 10.1), Informatica PowerCenter 10.1/10/9.6.1/9.5/9.1/8.6/8.1/7.1, Informatica Power Exchange, Metadata Manager, Informatica Data Quality (IDQ), Informatica Data Director (IDD), Customer C360, MDM, SSIS, Salesforce, DataStage, etc.
Data Modeling (working knowledge): Relational Modeling, Dimensional Modeling (Star Schema, Snowflake, Facts, Dimensions), Physical and Logical Data Modeling, ER Diagrams
Database Tools: SQL Server Management Studio (2008), Oracle SQL Developer (3.0), Toad 11.6 (Oracle), Teradata, AQT v9 (Advanced Query Tool, Oracle/Netezza), DB Artisan 9.0 (Sybase), SQL Browser (Oracle/Sybase), Visio, Erwin
Scheduling Tools: Informatica Scheduler, CA Scheduler (Autosys), ESP, Autosys, Control-M
Conversion/Transformation Tools: Informatica Data Transformation, Oxygen XML Editor (ver. 9, ver. 10)
Software Development Methodology: Agile, Waterfall
Domain Expertise: Publishing, Insurance/Finance, Healthcare
Others (working knowledge on some): OBIEE RPD creation, OBIEE Reports, ECM, Informatica Data Transformation XMAP, DAC, Rational ClearCase, WS-FTP Pro, DTD
RDBMS: SQL, SQL*Plus, SQL*Loader, Oracle 11g/10g/9i/8i, Teradata, IBM DB2, MS SQL Server 2000/2005/2008, MS Access 7.0/2000
Data Governance Tool: Collibra
Work Experience:

Client: Travelers Indemnity Dec 2016 – Present


Sr. Informatica ETL/MDM Developer
Responsibilities:

 Worked with DBAs and Informatica admins on server size estimation and Informatica tool installation.
 Worked with the Informatica Data Quality (IDQ) toolkit for analysis, data cleansing, data matching, and data conversion.
 Worked extensively on Informatica mappings to load data from various sources to final target tables based on the business logic.
 Designed web services applications for name standardization and for inserting records in real time
 Created real-time RESTful PUT, POST, and GET API calls, known as Business Entity Services, that expose a master entity from MDM to enable real-time integration of applications with MDM using the Provisioning Tool, and enabled smart search using Solr fuzzy search.
 Configured Informatica Data Director (IDD); designed workflow customizations for Data Stewards
 Performed administrative activities related to the MDM platform, including but not limited to the MDM Hub, Process Server, ActiveVOS, Provisioning, and the IDD/C360 UI.
 Configured C360 view to display the data of the root node of the composite objects
 Configured near-real-time capability: set up triggers and components against specific sources and events to publish "master record changed" messages in JSON format to RabbitMQ message queues
 Used Metadata manager for validating, promoting, importing and exporting repositories from
development environment to testing environment.
 Involved with Data Steward Team for designing, documenting and configuring Informatica Data
Director for supporting management of MDM data.
 Identified Golden Record (BVT) for Customer Data by analyzing the data and duplicate records
coming from different source systems.
 Helped the testing team for data integrity and consistency.
 Defined and configured the schema, staging tables, landing tables, base objects, foreign-key relationships, lookup systems and tables, packages, query groups, and queries/custom queries.
 Defined the Trust and Validation rules and set up the match/merge rule sets to get the right master records
 Used Hierarchies tool, for configuring entity base objects, entity types, relationship base objects,
relationship types, profiles, put and display packages and used the entity types as subject areas
in IDD.
 Designed the Informatica master data C360 processes, C360 Model objects, Entities, etc
 Implemented the C360 product functionality to support MDM application design within SFDC
 Configured C360 application configuration for Customer objects
 Worked with offshore teams to design and develop/configure C360 functionality
 Created Jenkins and UCD process for automatic code deployment to higher environments
 Successfully implemented IDD using hierarchy configuration and created subject area groups,
subject areas, subject area child, IDD display packages in hub and search queries.
 Deployed new MDM Hub for portals in conjunction with user interface on IDD application.
 Worked on JavaScript and Node.js to deploy applications in PCF to communicate with third-party vendors
 Extensively worked on E2E flow of data from IDQ to MDM and generating BVT using validation
rules and survivorship
 Created complex mappings for the client's retroactivity requirements.
 Designed an automation process to handle production data issues using scripts and SQL queries, and came up with the best approach to increase process performance
 Created web service and REST API calls from Informatica Developer to communicate with cloud data, and real-time insert/update calls to insert data into MDM directly
 Designed trust settings, Java user exits, security and privileges along with roles and groups, and the MDM data model, interacting with solution and portfolio architects to establish common framework-based solutions using the Informatica MDM Multi-Domain Hub.
 Developed database triggers, functions, stored procedures, packages, materialized views, and REF CURSORs using Oracle PL/SQL
 Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating
transformations.
 Extensively worked on Labeler, Parser, Key Generator, Match, Merge, and Consolidation
transformations to identify the duplicate records.
 Good Debugging skills and good at using Exceptions and Error Handling.
 Worked with Shortcuts across Shared and Non Shared Folders
 Involved in implementing the Land Process of loading the customer/product Data Set into
Informatica MDM from various source systems.
 Performed land process to load data into landing tables of MDM Hub using external batch
processing for initial data load in hub store.
 Involved in deployment of IDQ mappings to application and to different environments
 Preparation of UTP (Unit Test Plan) with all required validations and test cases.
 Responsible for testing and validating the Informatica mappings against the pre-defined ETL
design standards.
 Created various tasks like sessions, decision, timer & control to design the workflows based on
dependencies
 Involved in production support.
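A real-time Business Entity Services call like the ones described above can be sketched as below. The base URL, entity name, and field names are hypothetical; the snippet only builds the request (method, URL, JSON body) rather than invoking a live MDM hub, and is a sketch of the REST pattern, not Informatica's exact API contract.

```python
import json

def build_bes_request(base_url, entity, record_id=None, payload=None):
    """Build a (method, url, body) triple for a hypothetical MDM
    Business Entity Services call: GET to read a master record,
    POST to create one, PUT to update an existing one."""
    if payload is None:
        # read an existing master record by its row id
        return "GET", f"{base_url}/{entity}/{record_id}", None
    # write path: include the (possibly None) row id in the JSON body
    body = json.dumps({"rowidObject": record_id, **payload})
    method = "PUT" if record_id else "POST"
    url = f"{base_url}/{entity}" + (f"/{record_id}" if record_id else "")
    return method, url, body
```

The returned triple can then be handed to any HTTP client; in the real deployment such calls were made from Node.js applications running on PCF.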

Environment: Informatica Data Quality 10.2, Informatica PowerCenter 10.2, Informatica MDM 10.3, IDD, C360, Oracle 12c, MS SQL Server 2012, SQL, PL/SQL, shell scripts, Node.js, PCF, JavaScript, Jenkins, UCD.

UPMC - Pittsburgh Dec 2015 – Nov 2016


Informatica ETL/MDM Developer
Responsibilities:
 Involved in the complete end-to-end flow of SDLC.
 Working closely with business analysts and data analysts to understand and analyze the
requirement to come up with robust design and solutions.
 Working on IHAA (Informatica Healthcare Analytics Accelerator).
 Involved in requirement analysis, ETL design, and development for extracting data from heterogeneous source systems like EPIC, Cerner, and Medicare and loading it into Staging and the Enterprise Data Vault
 Designed, documented and configured the Informatica MDM Hub to support loading, cleansing
of data.
 Involved in implementing the Land Process of loading the customer/product Data Set into
Informatica MDM from various source systems.
 Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
 Developed ETL programs using Informatica to implement the business requirements.
 Developed several complex IDQ mappings using a variety of transformations, Mapping Parameters, Mapping Variables, Mapplets, and Parameter files in the Informatica PowerCenter Mapping Designer.
 Experience in ActiveVOS workflow design and creation of Human Tasks; extensive experience in MDM 10.x with IDQ and ActiveVOS 9.2.
 Experience in managing ActiveVOS Central and setting up the Identity Service on the ActiveVOS Console
 Worked on extracting data from the heterogeneous source systems like MS SQL Server, Oracle,
HL7 Files and loading into Landing Layer and then to Data Warehouse
 Involved in standardization of data, such as changing a reference data set to a new standard.
 Data validated by a third party before being provided to the internal transformations should be checked for accuracy (DQ).
 Worked on ActiveBatch job-scheduling automation to monitor and orchestrate automated processes that span the data center, IT operations and infrastructure, and business applications.
 Configured Base Objects, Match Rules, Merge Rules, Trust Settings, Stage and Landing tables, Queries, Packages, Validation Rules, Mappings, and Hierarchies using the Informatica MDM Multi-Domain Hub. Built an Informatica Data Director (IDD) UI application for data stewards to strategize stewardship functionality and the Entity 360 view by providing governance on role-based operations. Built Autosys jobs and UNIX shell scripts.
 Filtered XML claims files using filter conditions on the D9 segment and converted the filtered claims XML files back to EDI format using a serializer in B2B Data Transformation.
 Led B2B structured and unstructured transformations that included: resolving end user
problems, training on B2B transformations and resolving system failures.
 Modified the 834 file and then verified that the eligibility enrollment was properly loaded into Facets.
 Worked closely with different EDI transaction files like 837, 834, and 835 to understand the source structure and the source data patterns.
 Created Pre/Post Session SQL commands in sessions and mappings on the target instance.
 Responsible for creating workflows and sessions using Informatica Workflow Manager and monitoring workflow runs and statistics in Informatica Workflow Monitor.
 Implemented various loads like Daily, Weekly, and Quarterly Loads using an Incremental Loading Strategy.
 Created Audit, Control, and Batch tables to monitor the load process on user requests.
 Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
 Created profiles and score cards for the users using IDQ.
 Created Technical Specification Documents and Solution Design Documents to outline the
implementation plans for the requirements.
 Involved in testing of Stored Procedures and Functions, Unit and Integrating testing of
Informatica Sessions, Batches and the Target Data.
 Involved in massive data cleansing prior to data staging from flat files.
 Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, and Load) processes using Informatica PowerCenter 9.6, with various transformations like Expression, Source Qualifier, Filter, Router, Sorter, Aggregator, Update Strategy, and Connected and Unconnected Lookups.
 Implemented slowly changing dimensions Type 2 using ETL Informatica tool.
 Created Informatica components required to operate Data Quality (Power Center required)
 Designed best practices on Process Sequence, Dictionaries, Data Quality Lifecycles, Naming
Convention, and Version Control.
 Created Use-Case Documents to explain and outline data behavior.
 Developed scripts for creating tables, views, synonyms and materialized views in the datamart.
 Involved in designing and developing logical and physical data models to best suit the
requirements.
 Working with Informatica Developer (IDQ) tool to ensure data quality to the consumers.
 Used the Address Validator transformation to validate customer addresses from various countries using the SOAP interface
 Converted business requirements into highly efficient, reusable and scalable Informatica ETL
processes.
 Created mapping documents to outline source-to-target mappings and explain business-driven
transformation rules.
 Data sourced from a database with valid NOT NULL columns should not undergo a DQ completeness check.
 Worked on T-SQL programming, stored procedures, user-defined functions, cursors, and views, and set up and managed linked servers to non-SQL-Server databases for initial loads.
 Designed mappings to read data from various relational and file source systems such as SQL
Server, flat files and XML files.
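The claim-filtering work above (selecting claims out of EDI transaction files before re-serialization) can be illustrated with a minimal X12 segment filter. This is a sketch only: the segment terminator and element separator used are the common X12 defaults, and the sample data is invented, not real 837 content.

```python
def filter_segments(edi_text, seg_id, seg_term="~", elem_sep="*"):
    """Split raw X12 EDI text into segments and return only those
    whose segment ID matches seg_id (e.g. 'CLM' for claim segments)."""
    segments = [s.strip() for s in edi_text.split(seg_term) if s.strip()]
    return [s for s in segments if s.split(elem_sep)[0] == seg_id]
```

In B2B Data Transformation the equivalent selection is done declaratively with a parser and filter conditions, but the segment/element structure being matched is the same.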
Environment: Informatica PowerCenter 9.6.1, MDM 9.6.1, IDQ, B2B DT/DX, T-SQL, PL/SQL, TOAD, AIX, Windows NT/XP, XML file sources, Erwin 4, MS Visio, Remedy.

Open Mind Technologies May 2012 – July 2014


Software Developer
Responsibilities:
 Designed the ETL processes to migrate the existing ETL process from ODI (Oracle Data Integrator) to Informatica PowerCenter 9.1 and to load data from Oracle and Teradata sources and flat files into the Enterprise Data Warehouse.
 Extensively worked with the business and data analysts in requirements gathering and to
translate business requirements into technical specifications.
 Created source and Target Definitions, Reusable transformations, mapplets and worklets.
 Created Mappings and extensively used transformations like Source Qualifier, Filter, Update
Strategy, Lookup, Expression, Router, Joiner, Normalizer, Aggregator and Sequence Generator,
Web services Consumer.
 Built a re-usable staging area in Teradata for loading data from multiple source systems using
template tables for profiling and cleansing in IDQ
 Created profile and scorecards to review data quality.
 Actively involved in data validations and unit testing to make sure data is clean and standardized before loading into MDM landing tables.
 Actively involved in the exception-handling process using the IDQ Exception transformation after loading the data into MDM, notifying the Data Stewards of all exceptions.
 Developed Source to Staging and Staging to Dimension mappings.
 Developed Source to Staging and Staging to Fact mappings
 Extracted the data from the flat files, CSV files and other RDBMS databases like Teradata and
Oracle into staging area and populated onto Data warehouse.
 Developed a number of complex Informatica mappings, Mapplets, and reusable transformations to implement the business logic and load the data from the DW to the Responses Feeds Engine (RFE).
 Worked on Hadoop Job Client utilities and integrated them into monitoring system.
 Extensively worked on troubleshooting installation issues and admin console configuration issues.
 Worked on Autosys scheduling tool for scheduling ETL jobs.
 Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
 Used Workflow manager for session management, database connection management and
scheduling of jobs to be run in the batch process.
 Monitored and troubleshot batches and sessions for weekly and monthly extracts from various data sources across all platforms to the target database.
 Created data quality indicators for the bad source rows/records and assigned different numbers
to track down each source.
 Involved in Performance tuning at source, target, mappings, sessions, and system levels.
 End-to-end testing of data extracts, including sanity checks, integration testing of multiple extracts for a single system, and data/schema testing.
 Resolved gaps, questions, and issues on the test extracts provided.
 Tracked and planned for future IT enhancements, changes, and deliverables that would impact data extracts.
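The source-level data quality indicators described above (numeric flags attached to bad rows so they can be traced back to their source system) can be sketched as follows. The rule set, field names, and indicator codes here are hypothetical examples, not the actual project's scheme.

```python
def tag_bad_rows(rows, source_code):
    """Attach a numeric DQ indicator to each row so bad records can be
    traced back to their source system. Hypothetical indicator scheme:
    0 = clean, 1 = missing business key, 2 = missing amount."""
    tagged = []
    for row in rows:
        if not row.get("key"):
            dq = 1          # missing business key
        elif row.get("amount") is None:
            dq = 2          # missing amount
        else:
            dq = 0          # row passed all checks
        tagged.append({**row, "dq_indicator": dq, "source": source_code})
    return tagged
```

Downstream, the indicator column makes it straightforward to filter rejected rows per source and report reject counts back to the owning system.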
Environment: Informatica PowerCenter 8.1, IDQ, MDM, ODI, Erwin, UNIX, Oracle, Teradata, Autosys.
